nVIDIA PRICE DROP-7800GTX512 $399, 7900GT $299, 7900GTX512 $499


Jules

Lifer
Oct 9, 1999
15,213
0
76
Originally posted by: coldpower27
Originally posted by: MyStupidMouth
Originally posted by: coldpower27
Originally posted by: Zstream
I do not doubt that the 7900 series will be better; it would be silly to release it otherwise. The way things stand, though, if Nvidia does not change its architecture to perform well with shaders, then the point is moot at best. Raw MHz and pipelines will eventually die out.

To put it briefly, Nvidia needs more shader processors, or ALUs, like the X1900 series, which has 48 shader units.

This is always a good read if you like programming crap. DirectX shader pdf

Once again, I may be rambling, but a unified architecture seems to be the better route.

Not necessarily true. If they simply tie the X1900 series and their cards cost less to produce than the competition's, then they can win through more aggressive pricing. That is also a victory.
which is unlikely.

It is likely if the 7900 Series is a simple optical shrink of the 7800 Core.

Price? No.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Zstream
I do not doubt that the 7900 series will be better; it would be silly to release it otherwise. The way things stand, though, if Nvidia does not change its architecture to perform well with shaders, then the point is moot at best. Raw MHz and pipelines will eventually die out.

To put it briefly, Nvidia needs more shader processors, or ALUs, like the X1900 series, which has 48 shader units.

This is always a good read if you like programming crap. DirectX shader pdf

Once again, I may be rambling, but a unified architecture seems to be the better route.

I don't think you know what you're talking about. NV's shaders as of now are just fine. For example, looking at shader performance, NV's 24 pixel shaders are a match for the 48 ATI pixel shaders in the R580 (shown by the 1600x1200 benches with no AA/AF). NV has always had the edge, and on top of that the G70 architecture has 2 ALUs per pipeline. 2 full ALUs. If NV works on efficient AA, then NV cards will dominate shader-heavy games. They have always had the edge, even now, in shader performance.

Unified architecture really has no speed advantage as of now, but it might make sense for Vista due to the nature of DX10.

I'm sure NV has tweaked the G70 core further, because they had lots of time to make the core even more efficient. I'm hoping for improved AA efficiency and FX-series-style AF.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: Cygnus X1
http://www.xbitlabs.com/news/video/display/20060223090220.html

Glad I didn't buy that eVGA motherboard combo ;) This industry is cut-throat wow!

I'm glad I did... my receipt shows my 7800GT cost me $360... a 7900GTX will cost $500... so I'll trade mine in and get a 7900GTX for $140, minus the $20 rebate on my 7800GT. $120 + trade-in for a 7900GTX... plus I have a brand-new nForce4 SLI motherboard... not a bad deal.
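For what it's worth, the upgrade math quoted above works out; here is a quick sketch, using only the figures from the post and assuming (as described) that the trade-in credits the full original purchase price:

```python
# Net cost of stepping up from a 7800GT to a 7900GTX, using the
# figures quoted above. Assumes the trade-in credit equals the
# original 7800GT purchase price.
gtx_price = 500   # expected 7900GTX price
gt_paid   = 360   # what the 7800GT originally cost (trade-in credit)
rebate    = 20    # mail-in rebate already received on the 7800GT

step_up_cost = gtx_price - gt_paid    # difference paid at trade-in
net_cost     = step_up_cost - rebate  # effective out-of-pocket cost
print(f"step-up: ${step_up_cost}, net: ${net_cost}")  # step-up: $140, net: $120
```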
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Ackmed
The simple fact is, the X1800XT and even the X1900XTs are not that much more than the 256MB GTXs. In fact, the X1800XT is even cheaper than the 256MB GTX. Yet you claim the price/performance ratio is terrible on the XTs, but not on the GTX?

there are dozens of brands of GTX for around $450; there's one x1800xt i can find for $450 - overall their pricing seems to hover around $500 (and at that price, it doesn't compete well with the x1900xt). if i were to choose one or the other, i'd choose the XT (even at a little more cost) over a GTX, but the x1800xt is EOL, and was never a focus of this discussion.

if you want to bring in the 512mb gtx, i never claimed it was NOT ridiculous, and even went further and claimed it was about as close to vaporware as you can get (i think i'm correct to say the situation is/was as bad as or worse than the x800xtpe).

furthermore, there is some hesitation with any of the ati products in dual-card scenarios. while originally i never considered it an option, my opinion has now changed to "if you're gonna get pcie, you might as well upgrade all the way to an sli board"... my thinking being that it would be silly not to give yourself the option of dropping in a second card, even if that's not your original intention. unfortunately, the poor reviews of xfire mboards (the weakness iirc is mainly the southbridge), the lack of selection, and a solution overall more cumbersome than nvidia's cause hesitation.

that and the fact that sli'd 7800s draw less power (and therefore generate less heat) and are quieter than the x1900xt further strengthen nvidia's position when multiple cards are in the mix -- and again, why not give yourself that option even if initially you don't plan on going dual cards?

You are right (I think) about the heatsink. AT did an X1900 roundup, and the HIS XTX had the AC Silencer heatsink on it, not the XT. Maybe it does; AT didn't test it, so I can't say. HIS's website doesn't even show the heatsink that AT had.

yes, to be fair, AT incorrectly stated that in one of their articles -- going as far as describing the difference even tho the accompanying pictures showed (correctly) that the hsf were identical, lol...

look -- imo the gt is a better "bang for the buck" than the gtx. if the x1800xl had the same advantage over the gt as the x1800xt had over the gtx, i would say the XL is the better option (especially now that they are barely over $300, almost a full $150 less than they were at launch 4 mo ago).

but my point (and the topic of this thread) is the price drop of the upcoming 7900s and how that compares to the x1900s, NOT how the 7800s compare, and i just don't understand why you keep bringing the 7800s into this discussion....
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: coldpower27
Originally posted by: RussianSensation
If we ever look back on overall performance in games:

Geforce 3 Ti 200/500 was about the same speed as the Radeon 8500
Geforce 4 and 5 both lost to the Radeon 9700 (the only time one company really had a leg up on the other)
Geforce 6 = X800 series
Geforce 7 was beating the X800 for 6 months and then = X1800

Looking back, I am really starting to wonder whether arguing about which video card company did better during a particular generation is rather pointless. If we go back to those generations, then move forward a year to when the new "generation" of games came out and compare benches, for the majority of games both companies had cards that provided equally satisfying or equally dissatisfying gameplay. For example, the SM3.0 features of the 6800 series, which were argued about on these forums for a year, made practically no difference. I myself stated that the X800's better performance in shader-intensive games would make it a better buy for the future, yet that didn't give those cards any tangible advantage in FEAR, COD2, or BF2 either. The X1800 is statistically faster than the 7800 series, but was it worth waiting a whole 6 months?

The bottom line is, imo, the only thing that matters is generational leaps. The only questions to ask are whether it's worth upgrading from one generation to another and what's a good price/performance ratio. Looking at most benches today (besides the 9800Pro vs. the 5900 series and its abysmal DX9 performance), I doubt that over the last 4 years one can say that because he/she bought an Nvidia vs. an ATI card from the same generation, they were able to play games for 6 months longer without slowdowns than had they made the other choice. Perhaps what is really important is timing, pricing and availability, because at the end of the day having one company outperform another by 20% 6 months later means absolutely nothing. The opportunity cost for a real gamer is too high to warrant the waiting period.

Just my 2 cents.

Since when does the Geforce 4 Ti series lose to the Radeon 9700 series? The Geforce 4 Ti line was launched in Feb 2002, while the Radeon 9700 Pro didn't come out until the July/August time frame... it was the top card for several months until the release of the Radeon 9700 Pro, at which point ATI held the lead for a while. You're also forgetting the original Radeon 64 DDR against the Geforce 2 GTS.


well, it was actually the original radeon 32 DDR. i think i still have one somewhere heh...

to russian - the radeon 8500 was never a better performer than the gf3 (i have both of those cards too)..

and yes, the gf4 held the crown for some months (still have one of those as well) until the 9700 (and i have one of those also lol) came out late that summer.

but yea russian, other than that i agree with the idea you're trying to convey.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: CaiNaM
Originally posted by: coldpower27
Originally posted by: RussianSensation
If we ever look back on overall performance in games:

Geforce 3 Ti 200/500 was about the same speed as the Radeon 8500
Geforce 4 and 5 both lost to the Radeon 9700 (the only time one company really had a leg up on the other)
Geforce 6 = X800 series
Geforce 7 was beating the X800 for 6 months and then = X1800

Looking back, I am really starting to wonder whether arguing about which video card company did better during a particular generation is rather pointless. If we go back to those generations, then move forward a year to when the new "generation" of games came out and compare benches, for the majority of games both companies had cards that provided equally satisfying or equally dissatisfying gameplay. For example, the SM3.0 features of the 6800 series, which were argued about on these forums for a year, made practically no difference. I myself stated that the X800's better performance in shader-intensive games would make it a better buy for the future, yet that didn't give those cards any tangible advantage in FEAR, COD2, or BF2 either. The X1800 is statistically faster than the 7800 series, but was it worth waiting a whole 6 months?

The bottom line is, imo, the only thing that matters is generational leaps. The only questions to ask are whether it's worth upgrading from one generation to another and what's a good price/performance ratio. Looking at most benches today (besides the 9800Pro vs. the 5900 series and its abysmal DX9 performance), I doubt that over the last 4 years one can say that because he/she bought an Nvidia vs. an ATI card from the same generation, they were able to play games for 6 months longer without slowdowns than had they made the other choice. Perhaps what is really important is timing, pricing and availability, because at the end of the day having one company outperform another by 20% 6 months later means absolutely nothing. The opportunity cost for a real gamer is too high to warrant the waiting period.

Just my 2 cents.

Since when does the Geforce 4 Ti series lose to the Radeon 9700 series? The Geforce 4 Ti line was launched in Feb 2002, while the Radeon 9700 Pro didn't come out until the July/August time frame... it was the top card for several months until the release of the Radeon 9700 Pro, at which point ATI held the lead for a while. You're also forgetting the original Radeon 64 DDR against the Geforce 2 GTS.


well, it was actually the original radeon 32 DDR. i think i still have one somewhere heh...

to russian - the radeon 8500 was never a better performer than the gf3 (i have both of those cards too)..

and yes, the gf4 held the crown for some months (still have one of those as well) until the 9700 (and i have one of those also lol) came out late that summer.

but yea russian, other than that i agree with the idea you're trying to convey.

Well, I am quite sure that ATI released both 32MB and 64MB versions of the original Radeon, in both SDR and DDR configurations, I might add.
http://www.anandtech.com/video/showdoc.aspx?i=1281

 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Cookie Monster
Originally posted by: Zstream
I do not doubt that the 7900 series will be better; it would be silly to release it otherwise. The way things stand, though, if Nvidia does not change its architecture to perform well with shaders, then the point is moot at best. Raw MHz and pipelines will eventually die out.

To put it briefly, Nvidia needs more shader processors, or ALUs, like the X1900 series, which has 48 shader units.

This is always a good read if you like programming crap. DirectX shader pdf

Once again, I may be rambling, but a unified architecture seems to be the better route.

I don't think you know what you're talking about. NV's shaders as of now are just fine. For example, looking at shader performance, NV's 24 pixel shaders are a match for the 48 ATI pixel shaders in the R580 (shown by the 1600x1200 benches with no AA/AF). NV has always had the edge, and on top of that the G70 architecture has 2 ALUs per pipeline. 2 full ALUs. If NV works on efficient AA, then NV cards will dominate shader-heavy games. They have always had the edge, even now, in shader performance.

Unified architecture really has no speed advantage as of now, but it might make sense for Vista due to the nature of DX10.

I'm sure NV has tweaked the G70 core further, because they had lots of time to make the core even more efficient. I'm hoping for improved AA efficiency and FX-series-style AF.



nVidia hasn't done well in AA/AF tests since this architecture's inception during the FX series. It's not going to start doing well now just because they shrunk it down to 90nm. Furthermore, a 100MHz clock increase on the core will not come close to an 18% increase in performance over a 512 GTX, since it comes with a decrease in bandwidth. It will give a bit better shader throughput and slightly higher performance when using AF, but it will still be limited in situations that use high resolution and AA.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
You are assuming that the 100MHz decrease in memory clock will affect performance, but it won't. If they did drop it, the RAM timings must be tighter, compared to the looser timings needed for a higher clock. Theoretically, tighter timings give you better performance. But in the end, I don't think it will be noticeable in real life.

I never said anything about AA/AF improving because of 90nm low-k. I just pointed out that since the G71 is a refresh, NV's engineers would definitely have looked at their hardware and started tweaking the G70 core. They are firing on all cylinders: profit margins are higher every quarter, and their reputation is growing. I don't think they will "just" release a die-shrunk core. That's why I think they are currently looking at many ways to reduce the performance hit when using AA, and to improve AF somewhat.

 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: CaiNaM
Originally posted by: Ackmed
The simple fact is, the X1800XT and even the X1900XTs are not that much more than the 256MB GTXs. In fact, the X1800XT is even cheaper than the 256MB GTX. Yet you claim the price/performance ratio is terrible on the XTs, but not on the GTX?

there are dozens of brands of GTX for around $450; there's one x1800xt i can find for $450 - overall their pricing seems to hover around $500 (and at that price, it doesn't compete well with the x1900xt). if i were to choose one or the other, i'd choose the XT (even at a little more cost) over a GTX, but the x1800xt is EOL, and was never a focus of this discussion.

if you want to bring in the 512mb gtx, i never claimed it was NOT ridiculous, and even went further and claimed it was about as close to vaporware as you can get (i think i'm correct to say the situation is/was as bad as or worse than the x800xtpe).

furthermore, there is some hesitation with any of the ati products in dual-card scenarios. while originally i never considered it an option, my opinion has now changed to "if you're gonna get pcie, you might as well upgrade all the way to an sli board"... my thinking being that it would be silly not to give yourself the option of dropping in a second card, even if that's not your original intention. unfortunately, the poor reviews of xfire mboards (the weakness iirc is mainly the southbridge), the lack of selection, and a solution overall more cumbersome than nvidia's cause hesitation.

that and the fact that sli'd 7800s draw less power (and therefore generate less heat) and are quieter than the x1900xt further strengthen nvidia's position when multiple cards are in the mix -- and again, why not give yourself that option even if initially you don't plan on going dual cards?

$400.08 shipped from Newegg for a retail X1800XT. $399.99 shipped from ZZF for a retail X1800XT. Overall pricing is $500 for an X1800XT? Yeah, if you don't actually try to look. It took me 1 min to look on two of the best hardware sites. As I said, cheaper than a GTX. So if you are going to claim that the X1800XTs have a bad price/performance ratio, you need to claim the GTX does too.

I never said anything about the 512MB GTX, nor multi-card. I was just pointing out the bias in your comment, when you claimed the X1800XT's price/performance ratio was "ridiculous" when it's better than the GTX's. SLI'd 512MBs draw more power than X1900s do. And the heat from an X18/900 card is exhausted out the back, not swirled around the case.

And the 8500 was faster than the GF3. It wasn't until the GF3 Ti series came out that the GF3 overtook it in most of the benchmarks. Not all, but most.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Cookie Monster
You are assuming that the 100MHz decrease in memory clock will affect performance, but it won't. If they did drop it, the RAM timings must be tighter, compared to the looser timings needed for a higher clock. Theoretically, tighter timings give you better performance. But in the end, I don't think it will be noticeable in real life.

I never said anything about AA/AF improving because of 90nm low-k. I just pointed out that since the G71 is a refresh, NV's engineers would definitely have looked at their hardware and started tweaking the G70 core. They are firing on all cylinders: profit margins are higher every quarter, and their reputation is growing. I don't think they will "just" release a die-shrunk core. That's why I think they are currently looking at many ways to reduce the performance hit when using AA, and to improve AF somewhat.


OK, how will they accomplish a smaller AA hit? I doubt they changed their memory controller, since it's been virtually the same since the 6800 series, and we know increasing the number of ROPs won't help either. So tell me, what could they have done? The only rumor was a possible tweak to the architecture's FP units over "previous generation" products, but DailyTech made it quite clear that was in reference to the 6800 series (2x MADD operations per clock). Of course AF performance will increase slightly with the 100MHz increase in clock speed, and theoretically they could use tighter timings, but the 512 GTX's performance suggests they were already using tight timings at a higher memory clock speed.
 

Leper Messiah

Banned
Dec 13, 2004
7,973
8
0
wow.

some of you really need to get some rabies shots.


Coldpower27, even a 15% boost in performance at that high an AA/AF level is doubtful. nVidia's AA performance is already worse than ATI's, and the 100MHz reduction in memory speed is going to hurt that a bit.

The pricing is awesome, though, if it comes out like that. Kudos to nVidia for making pricing a little more realistic; ATI will probably follow suit just to keep sales. In the end we all win.
 

moonboy403

Golden Member
Aug 18, 2004
1,828
0
76
yea! keep 'em down

i remember buying the 9800 pro for only $200

the high-end cards are now like $600+
just ridiculous
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: moonboy403
if we knew how they do it... we'd be engineering the cards ourselves, joker!

QFT. Joker, I'm not an engineer yet so I can't tell you :)

Breaking news: Cookie Corp has released its next flagship card, the Cookie 8900 GTX Monster Edition. Homemade cookies are bundled with the card, and it is delivered by cookie monsters across the globe. :laugh:
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
All for $99, these monsters come equipped with the latest Choc Chip RAM and a core clock of flour/wheat MHz. Furthermore, you can step up for FREE without trading in your card! FREE COOKIE FOR EVERYONE! :D
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: Leper Messiah
wow.

some of you really need to get some rabies shots.


Coldpower27, even a 15% boost in performance at that high an AA/AF level is doubtful. nVidia's AA performance is already worse than ATI's, and the 100MHz reduction in memory speed is going to hurt that a bit.

The pricing is awesome, though, if it comes out like that. Kudos to nVidia for making pricing a little more realistic; ATI will probably follow suit just to keep sales. In the end we all win.

We will see. To me, enough fillrate can compensate for a memory bandwidth weakness. I don't really think a 5.9% reduction in memory bandwidth is going to castrate it much, if at all; 15% seems quite reasonable to me, as shader power typically plays a much larger role than memory bandwidth does in most scenarios. The 7900 launch is fairly near, so we will see soon enough.
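That 5.9% figure is easy to check as a sketch, assuming a 256-bit bus on both cards and the effective memory clocks discussed in this thread (1700MHz on the 7800GTX 512MB, a rumored 1600MHz on the 7900GTX):

```python
# Rough check of the "5.9% reduction in memory bandwidth" figure.
# Assumptions: 256-bit bus on both cards, effective memory clocks
# of 1700MHz (7800GTX 512MB) and 1600MHz (rumored 7900GTX).
BUS_BITS = 256

def bandwidth_gb_s(effective_mhz):
    # clock (Hz) * bus width (bits) / 8 bits per byte, in GB/s
    return effective_mhz * 1e6 * BUS_BITS / 8 / 1e9

old = bandwidth_gb_s(1700)             # 54.4 GB/s
new = bandwidth_gb_s(1600)             # 51.2 GB/s
reduction = (old - new) / old * 100
print(f"{reduction:.1f}%")             # 5.9%
```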
 

yacoub

Golden Member
May 24, 2005
1,991
14
81
This thread is making me hungry! :(

:D

Btw, I agree with "even if they only tie the X1900XT/XTX, they win if they price cheaper". That's obvious, and I think it's the path they're taking with the G71 refresh of the G70. It's not going to blow the X1900XT/XTX out of the water, but I expect it will increase the 7800's leads in OGL, win some of the tests where the 7800GTX 512MB was closely tied with the X1900XT/XTX, and start to tie ATI in games where the 7800s lost. If they can do even this, plus come out cheaper, it's a winner.

If they can do it with a quietly cooled version from Asus or Gigabyte or whoever, it's also going to be my next card. :) (Probably the 7900GT, not the GTX, but it depends on the final pricing and the price at the time I buy, which will be after the X1900XL comes out near the end of March.)
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: moonboy403
yea! keep 'em down

i remember buying the 9800 pro for only $200

the high-end cards are now like $600+
just ridiculous


You must have bought it used, or when it was old. It was $400 when it first came out; the 9800XT was $500 MSRP.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: yacoub
This thread is making me hungry! :(

:D

Btw, I agree with "even if they only tie the X1900XT/XTX, they win if they price cheaper". That's obvious, and I think it's the path they're taking with the G71 refresh of the G70. It's not going to blow the X1900XT/XTX out of the water, but I expect it will increase the 7800's leads in OGL, win some of the tests where the 7800GTX 512MB was closely tied with the X1900XT/XTX, and start to tie ATI in games where the 7800s lost. If they can do even this, plus come out cheaper, it's a winner.

If they can do it with a quietly cooled version from Asus or Gigabyte or whoever, it's also going to be my next card. :) (Probably the 7900GT, not the GTX, but it depends on the final pricing and the price at the time I buy, which will be after the X1900XL comes out near the end of March.)

QFT. Any other improvements in features will just be icing on the cake, really.
 

crazydingo

Golden Member
May 15, 2005
1,134
0
0
Originally posted by: Cookie Monster
Originally posted by: yacoub
This thread is making me hungry! :(

:D

Btw, I agree with "even if they only tie the X1900XT/XTX, they win if they price cheaper". That's obvious, and I think it's the path they're taking with the G71 refresh of the G70. It's not going to blow the X1900XT/XTX out of the water, but I expect it will increase the 7800's leads in OGL, win some of the tests where the 7800GTX 512MB was closely tied with the X1900XT/XTX, and start to tie ATI in games where the 7800s lost. If they can do even this, plus come out cheaper, it's a winner.

If they can do it with a quietly cooled version from Asus or Gigabyte or whoever, it's also going to be my next card. :) (Probably the 7900GT, not the GTX, but it depends on the final pricing and the price at the time I buy, which will be after the X1900XL comes out near the end of March.)

QFT. Any other improvements in features will just be icing on the cake, really.
Or the lack of features is an indication of the price. I mean, for $50 less you get the same speed(?) and fewer features. That's assuming ATI doesn't do any price cuts.

Price wars FTW. :cool:
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Ackmed
Originally posted by: CaiNaM
Originally posted by: Ackmed
The simple fact is, the X1800XT and even the X1900XTs are not that much more than the 256MB GTXs. In fact, the X1800XT is even cheaper than the 256MB GTX. Yet you claim the price/performance ratio is terrible on the XTs, but not on the GTX?

there are dozens of brands of GTX for around $450; there's one x1800xt i can find for $450 - overall their pricing seems to hover around $500 (and at that price, it doesn't compete well with the x1900xt). if i were to choose one or the other, i'd choose the XT (even at a little more cost) over a GTX, but the x1800xt is EOL, and was never a focus of this discussion.

if you want to bring in the 512mb gtx, i never claimed it was NOT ridiculous, and even went further and claimed it was about as close to vaporware as you can get (i think i'm correct to say the situation is/was as bad as or worse than the x800xtpe).

furthermore, there is some hesitation with any of the ati products in dual-card scenarios. while originally i never considered it an option, my opinion has now changed to "if you're gonna get pcie, you might as well upgrade all the way to an sli board"... my thinking being that it would be silly not to give yourself the option of dropping in a second card, even if that's not your original intention. unfortunately, the poor reviews of xfire mboards (the weakness iirc is mainly the southbridge), the lack of selection, and a solution overall more cumbersome than nvidia's cause hesitation.

that and the fact that sli'd 7800s draw less power (and therefore generate less heat) and are quieter than the x1900xt further strengthen nvidia's position when multiple cards are in the mix -- and again, why not give yourself that option even if initially you don't plan on going dual cards?

$400.08 shipped from Newegg for a retail X1800XT. $399.99 shipped from ZZF for a retail X1800XT. Overall pricing is $500 for an X1800XT? Yeah, if you don't actually try to look. It took me 1 min to look on two of the best hardware sites. As I said, cheaper than a GTX. So if you are going to claim that the X1800XTs have a bad price/performance ratio, you need to claim the GTX does too.

not sure whether you deliberately ignore what is being discussed, don't bother to actually read what was written, or just lack comprehension.. but AGAIN, i never claimed any such thing.. you keep bringing the EOL x1800 into the discussion; it wasn't in mine.

I never said anything about the 512MB GTX, nor multi-card. I was just pointing out the bias in your comment, when you claimed the X1800XT's price/performance ratio was "ridiculous" when it's better than the GTX's. SLI'd 512MBs draw more power than X1900s do. And the heat from an X18/900 card is exhausted out the back, not swirled around the case.

again, you are wrong on multiple points. i never made any such claims about the x1800, and therefore there is no bias in my statements.

sli'd gtx 512s may draw more power (tho i do know a single one consumes much less power), but i never said sli'd 512s. sli'd gts offer similar performance to a single x1900, consume less power, and generate less heat. that being said, i would take the single card over a dual setup given similar performance..

for whatever reason you have nothing better to do than take things out of context or outright change what was said in order to.. i guess satisfy yourself that it appears that way? guess only you really know....

And the 8500 was faster than the GF3. It wasn't until the GF3 Ti series came out that the GF3 overtook it in most of the benchmarks. Not all, but most.

and no, it wasn't. wrong again. as the 8500 was released, nvidia released new det drivers to ensure their benchmark leads, and later released the ti, which kept them ahead.

the 8500 was also arguably released a little too soon, as driver issues and such complicated matters more.

back to the subjects you keep changing the discussion to...

on the x1800xt, those are indeed nice finds, even for a discontinued product.. wonder why 99% are at $500 tho? are these gems, or are all the rest going to come down now that they're EOL'd..

at any rate, i have no problem saying that at those prices the "bang for the buck" is very good - tho again i will reiterate that was not my point, or the point of this discussion, at all. this was about the rumored 7900 prices and current x1900 prices, so regardless of the fact that you keep changing the discussion back to last-gen products, i'll stick with the point i was making earlier - paying a premium of 50% or more for an additional 10-15% performance is not a good value at all, and IF (again, this is conjecture/rumor at this point) the 7900GT comes in at $299 msrp and offers 90% of the performance of the "flagship" cards, ANY card for $5-600 is not a very good value at all.

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
The X1900XT is about on par with the 7800GTX 512MB,
while 2x 7900GT (which is similar to 7800GTX 256MB SLI) will cost around the same as a single X1900XTX.

From techreport:
Quake4 20x15 4xAA 8xAF
7800GTX SLi: 50 fps
X1900XTX: 40.1
7800GTX 512: 42.8
X1900XT: 38.1

HL2:Lost Coast 20x15 HDR 16xAF
7800GTX SLi: 56.1
X1900XTX: 45.7
7800GTX 512: 45.4
X1900XT: 43.6

F.E.A.R 12x9 4xAA 16xAF
7800GTX SLi: min:29 avg:66
X1900XTX: min:26 avg:57.5
7800GTX 512: min:24 avg:44.4
X1900XT: min:26 avg:54.7

BF2 19x14 4xAA 16xAF
7800GTX SLi: min:51 avg:68.7
X1900XTX: min:47 avg:66.3
7800GTX 512: min:43 avg:61.2
X1900XT: min:41 avg:57.9

Guild Wars 19x14 4xAA 16xAF
7800GTX SLi: min:80 avg:121.2
X1900XTX: min:62 avg:86.2
7800GTX 512: min:71 avg:95
X1900XT: min:60 avg:79.9

2x 7900GT will definitely outperform an X1900XTX, since a 7900GT is the equivalent of a 7800GTX clocked at 450/1300. Prices would be the same as well.

Food for thought.
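As a rough sketch of how those standings translate into percentage leads (using only the Quake4 averages quoted above; the helper function is just illustrative):

```python
# Average fps from the Quake4 2048x1536 4xAA/8xAF numbers quoted
# above (per The Tech Report).
fps = {
    "7800GTX SLI": 50.0,
    "X1900XTX": 40.1,
    "7800GTX 512": 42.8,
    "X1900XT": 38.1,
}

def lead_pct(a, b):
    """Percentage by which card a outperforms card b."""
    return (fps[a] / fps[b] - 1) * 100

print(f"{lead_pct('7800GTX SLI', 'X1900XTX'):.0f}%")  # 25%
print(f"{lead_pct('7800GTX 512', 'X1900XT'):.0f}%")   # 12%
```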