2900XT close in price to the 8800GTS...


Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: swtethan
Originally posted by: cmdrdredd
Originally posted by: Piuc2020
I don't understand why people whine so damn much about drivers...

You guys all live in a fantasy world.

Because when the game you play runs at 10fps because the drivers don't work correctly, it becomes a problem.

never had a game do that


<--- ultimate 64

dredd, you are kind of exaggerating this bit. NV has vastly improved its Vista 32- and 64-bit drivers. They are not perfect and still have some issues, but they're much better off than they were.

@Tuteja: AMD is talking up a 2950pro. Thank you for answering for Marty, but I wanted to hear from him on this.
 

Marty502

Senior member
Aug 25, 2007
497
0
0
Well, this topic went much further than I expected. I admit I have read some information here and there, but I'm not really up to date with everything.

For the one who asked, I have no intention at all of changing my monitor. I just like it, period. I don't need to.

You guys reckon, then, that an 8800GTS 320 would be good enough for me? For, say... one and a half years at most. I'd be playing at 1280x960 with 6xAA at most, and that's good enough for me. I'll be buying an X2/Opteron soon too, plus an extra GB of RAM. Or should I just get the 2900XT? I'm willing to save a bit more, and I've seen some tests where the 2900XT is faster... correct me if I'm wrong, guys!

EDIT: About what Tuteja said, well... I hadn't even thought of that! It seems a fair point, and it draws me closer to the ATI.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: keysplayr2003
Originally posted by: swtethan
Originally posted by: cmdrdredd
Originally posted by: Piuc2020
I don't understand why people whine so damn much about drivers...

You guys all live in a fantasy world.

Because when the game you play runs at 10fps because the drivers don't work correctly, it becomes a problem.

never had a game do that


<--- ultimate 64

dredd, you are kind of exaggerating this bit. NV has vastly improved its Vista 32- and 64-bit drivers. They are not perfect and still have some issues, but they're much better off than they were.

@Tuteja: AMD is talking up a 2950pro. Thank you for answering for Marty, but I wanted to hear from him on this.

You're not reading what I say. LOOK AT THE GAMES YOU PLAY. OK, so I had to shout...

The games I play have huge issues (i.e. they don't work) with the 8800 series. That's all. You've got to take into account what you play more than what is faster, IMO.
 

Marty502

Senior member
Aug 25, 2007
497
0
0
About the "look at the games you play" point... I just don't know what I'll be playing next year! I don't even know what games are coming up besides UT3 and Crysis. Dunno if that's the best criterion for choosing a card, to be honest. With that in mind, I should be looking at the numbers, and at the card with the most potential and raw power. That should draw me to the ATI. Would anyone like to add anything?
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
I agree that the 320MB GTS would be a good card for you. There is a very good chance that you would never need the extra 320MB of memory. It's cheaper and better than the 2900XT (in your case with the lower screen resolution).

IMO the 2900XT runs too hot and uses up too much power to be considered viable. You'll probably need a PSU with a rating 100w higher than the 8800 cards need. Plus, the PSU needs to have the new 8-pin connectors which is a PITA.

The GTS consistently outperforms the 2900XT from what I have seen. Granted, they are usually pretty close and the 2900XT wins a few tests, but really the 8800 is the superior card.

Also keep in mind that you will save money on electricity by going with the more efficient 8800, plus your entire rig will run a little cooler, which will help out with your CPU and system temperatures.
 

Marty502

Senior member
Aug 25, 2007
497
0
0
Good points, SickBeast. I'll take them into consideration. :D

I saw some benchmarks where the 2900XT seemed faster. Guess I'll dig a bit further, and check the low res results.

Thanks again!
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: SickBeast
I agree that the 320MB GTS would be a good card for you. There is a very good chance that you would never need the extra 320MB of memory. It's cheaper and better than the 2900XT (in your case with the lower screen resolution).

IMO the 2900XT runs too hot and uses up too much power to be considered viable. You'll probably need a PSU with a rating 100w higher than the 8800 cards need. Plus, the PSU needs to have the new 8-pin connectors which is a PITA.

The GTS consistently outperforms the 2900XT from what I have seen. Granted, they are usually pretty close and the 2900XT wins a few tests, but really the 8800 is the superior card.

Also keep in mind that you will save money on electricity by going with the more efficient 8800, plus your entire rig will run a little cooler, which will help out with your CPU and system temperatures.

Nah, the PSU issue is a non-issue. Most good 500W PSUs will run a 2900XT.
You're not gonna immediately see your case temp rise and your power bill jump from using an HD 2900XT either. That's just utter BS, to be honest, unless you really have poor case airflow. The PSU does not need the 8-pin connector, not even for overclocking; you can use third-party tools to overclock the card rather than the Catalyst Control Center. You just need 2x 6-pin.

I'm saying look at the games you play currently, Marty. Sure, you don't know what's coming, but if one game makes up 99% of your gaming right now, say Quake Wars or whatnot, you want the card that performs best in that game. That's what I look at. I don't buy a card planning to play every game under the sun for 3 years; that's silly. But think about this: if you play at 1280x1024 or some resolution below 1600x1200, then I doubt very highly that any current card (8800 or 2900XT) would fail to play the game adequately. All the benchmarks I ever see these days are for extremely high resolutions beyond what I play at. That's fine for seeing which card is the fastest, but it tells me nothing about how a game performs at the resolution my monitor can handle.

P.S. If you say "I dunno what I'll be playing in a year and I want the card to play those games," then you shouldn't even buy either card now. You'd be better off waiting longer, and then guess what? There will be another card coming out.
 

Marty502

Senior member
Aug 25, 2007
497
0
0
Thank you cmdrdredd!

With each post and review I read, I'm leaning closer and closer to the 8800GTS 320 again. Hell, even 1024x768 with 2xAA is acceptable to me sometimes, depending on the game. I'll end up CPU limited with either a 2900XT or an 8800GTS 320 even if I upgrade to a dual core, so I don't think I'll really need that extra RAM and power. Plus it's cheaper, and I'll have to change my PSU no matter what card I get (only 18A on the 12V rail), so there's no disadvantage, really.
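For what it's worth, my back-of-the-envelope math on that rail (just volts times amps; the card figures are rough numbers I've seen around, not spec-sheet values):

    18A x 12V = 216W total available on the +12V line
    216W minus the roughly 100-160W these cards are said to pull under load = not much left for the CPU and drives

So yeah, new PSU regardless.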

Thanks again man. :D
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: Marty502
Thank you cmdrdredd!

With each post and review I read, I'm leaning closer and closer to the 8800GTS 320 again. Hell, even 1024x768 with 2xAA is acceptable to me sometimes, depending on the game. I'll end up CPU limited with either a 2900XT or an 8800GTS 320 even if I upgrade to a dual core, so I don't think I'll really need that extra RAM and power. Plus it's cheaper, and I'll have to change my PSU no matter what card I get (only 18A on the 12V rail), so there's no disadvantage, really.

Thanks again man. :D

That's cool. Yeah get a good PSU too because it's an important part of overall system stability.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Check prices and compare. The 2900XT is probably a little bit better in the long haul than the 8800GTS 640, but it is also usually priced higher. If you can get a deal like the $310 one I saw at Dell (sadly it ended 8/31), then you're probably better off with the 2900XT. If you're stuck paying closer to $400 for the 2900XT vs. $260 or so for the 8800GTS 320, then it's a no-brainer to save the $140.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: swtethan
Originally posted by: cmdrdredd
Because when the game you play runs at 10fps because the drivers don't work correctly, it becomes a problem.
never had a game do that
I have:

Click #1
Click #2

These two games are currently unplayable on G80 hardware and I've been waiting for a fix since last November. If I drop a 7900 GTX into the same system using the same driver they function perfectly.

To reiterate what was said earlier, nVidia make awesome hardware but their driver support sucks balls.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: SickBeast

Also keep in mind that you will save money on electricity by going with the more efficient 8800, plus your entire rig will run a little cooler, which will help out with your CPU and system temperatures.

You were really trying in your whole post, but this "point" is just laughable.

Care to tell us how much per year someone would save? Let's see some hard factual numbers here, not some number you think up.

 

mruffin75

Senior member
May 19, 2007
343
0
0
I know I'm going to get flamed for this... but what about a 2600? (Or whatever the NVIDIA equivalent is, if you lean that way... an 8600?)

I think you'd be CPU limited with an 8800 or 2900; might as well keep your system balanced (and save $$$ for the "next round" of cards).

I wish I'd bought a couple of 2600s instead of the single 2900. Don't get me wrong, the 2900 is great, but for my system, which is CPU restricted, the 2600 would've been fine, and it would've played most games fine as long as I didn't want to run them at some extreme resolution.
 

Marty502

Senior member
Aug 25, 2007
497
0
0
To be honest, I think the 8600 wouldn't be a good choice. I can spend the extra dough, and with an 8600 I'd most probably have to dial back AA and AF in the upcoming games, something I really look forward to NOT doing.

For reference, after the video card I'll buy an Opteron 170 and overclock the crap out of it.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Ackmed
Originally posted by: SickBeast

Also keep in mind that you will save money on electricity by going with the more efficient 8800, plus your entire rig will run a little cooler, which will help out with your CPU and system temperatures.

You were really trying in your whole post, but this "point" is just laughable.

Care to tell us how much per year someone would save? Let's see some hard factual numbers here, not some number you think up.

Trying to what, exactly? What is he trying to do?

And then:
Does a 2900XT require and use more power than an 8800GTS 320? (yes) (no)
Does a 2900XT run warmer than an 8800GTS 320? (yes) (no)

Aside from an exact monetary figure of how much money a given user would save/lose going with either card, one would have to agree that it costs $x.xx per year more to run a card that requires and uses more power. Is it negligible? It sure could be. Is it substantial? It sure could be.
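The formula is simple enough that anyone can plug in their own numbers (the variables are yours to fill in; I'm not claiming any of them):

    extra cost per year = (extra watts / 1000) x hours gamed per year x your electricity rate in $/kWh

Whether that comes out negligible or substantial depends entirely on your hours and your rate.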
 

mruffin75

Senior member
May 19, 2007
343
0
0
Originally posted by: Marty502
To be honest, I think the 8600 wouldn't be a good choice. I can spend the extra dough, and with an 8600 I'd most probably have to dial back AA and AF in the upcoming games, something I really look forward to NOT doing.

For reference, after the video card I'll buy an Opteron 170 and overclock the crap out of it.

I guess it just depends on how much money you want to spend. I've found that my personal preference is to buy a mid-range card and not worry about AA and AF that much; something that'll play nicely at 1280x1024 with no AA/AF would be fine. Mind you, "nicely" doesn't mean it has to be running at 100fps, and I'm not a "hard-core" gamer. I've found that my A64 X2 4600 and an X1600 Pro run Counter-Strike: Source at (I think) 1024x768 quite acceptably. I'm not sure if there's any AA/AF applied, as I haven't bothered to check the settings in a while; it runs quite smoothly for me. Of course it'd be a lot better if I transplanted the 2900XT into that machine, but for less than $100, the X1600 does its job just fine.

Maybe I'll exchange the 2900 for a 2600 someday... sorry, getting off-topic now.

EDIT: I run CS:S at 1024x768 with 4x AA and 8x AF, and get around 70-80 fps on an X1600 Pro.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: cmdrdredd
Originally posted by: swtethan
Originally posted by: cmdrdredd
Originally posted by: Piuc2020
I don't understand why people whine so damn much about drivers...

You guys all live in a fantasy world.

Because when the game you play runs at 10fps because the drivers don't work correctly, it becomes a problem.

never had a game do that


<--- ultimate 64

As I said before: not everyone, and not in every game. The issue of texture memory being eaten up and the fps then dropping steadily to below playable levels remains to be fixed. Nvidia said it should be fixed by this month. Here's hoping. It affects one of the games I regularly play, Final Fantasy XI. It's an older game, but one that should not run at 10fps on the newest hardware.

Pretty sure the texture memory problem is being fixed on the OS side with a hotfix; it's related to the problem with the Vista WDDM eating up massive amounts of system RAM.

As for older games and support: if you buy a new piece of hardware, you should go in with the expectation that older games released on an older OS may or may not have full backwards compatibility. If you buy a PS3, even if it says it supports most PS2 titles, are all PS2 titles going to play as well on a PS3? No.

FFXI works fine with a G80 in XP. Other than the menu overlay problem fixed in the 100-series drivers, performance was flawless and locked at 30 FPS at 1920x1200 (although it still dropped to the mid-teens in Dynamis, Aery, etc.). Honestly not a huge deal, considering the game is capped at 30 FPS anyway and runs on a poorly performing PS2 engine to begin with.

There were problems with G80s and Vista when Vista first released. I haven't reloaded FFXI since I moved to Vista, so I'm not sure if the problems still exist. When I stopped playing, SE still did not officially support Vista, but I wouldn't be surprised if it's as much a client/engine problem as it is an OS/driver problem.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: SickBeast
I agree that the 320MB GTS would be a good card for you. There is a very good chance that you would never need the extra 320MB of memory. It's cheaper and better than the 2900XT (in your case with the lower screen resolution).

Agreed.


Originally posted by: SickBeast
IMO the 2900XT runs too hot and uses up too much power to be considered viable. You'll probably need a PSU with a rating 100w higher than the 8800 cards need.

Actually, the 2900XT pulls about 90W more only when compared against the 8800GTS 320. Against an 8800GTS 640 the gap narrows to 65W, and it finally drops to only 17W against an 8800GTX.


Originally posted by: SickBeast
Plus, the PSU needs to have the new 8-pin connectors which is a PITA.

You do not need to have an 8-pin connector in order to use the 2900XT. The only time you need an 8-pin connector is if you intend to overclock the card and void your warranty. The 2900XT works just fine with the 6-pin connectors.


Originally posted by: SickBeast
The GTS consistently outperforms the 2900XT from what I have seen. Granted, they are usually pretty close and the 2900XT wins a few tests, but really the 8800 is the superior card.

I thought that the newer drivers put the 2900XT performance above the 8800GTS 640 and within reach of the 8800GTX. Or did Nvidia also have some major driver improvements which placed the 2900XT back to 8800GTS levels?


Originally posted by: SickBeast
Also keep in mind that you will save money on electricity by going with the more efficient 8800

I do agree that you will save money, but it is a very, VERY, VERY small amount. Even if you compare the 2900XT to the 8800GTS 320, that's a 90 watt difference. The average time spent playing games is 8 hours per week. 8 hours per week x 4 weeks per month x 90 watts = 2,880 watt-hours = 2.88 kWh, and 2.88 kWh x $0.05 per kWh = 14.4 cents per month. That amount is so negligible that it cannot even be considered a relevant factor.
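If anyone wants to plug in their own rate and hours, here's the same arithmetic as a throwaway Python sketch (the defaults are just my assumptions above, not measured figures):

    # Extra electricity cost of the higher-draw card, per month.
    # Defaults: 90W load delta, 8 hours/week of gaming, $0.05/kWh.
    def extra_cost_per_month(delta_watts=90, hours_per_week=8, rate_per_kwh=0.05):
        kwh_per_month = delta_watts / 1000.0 * hours_per_week * 4  # approx. 4 weeks/month
        return kwh_per_month * rate_per_kwh  # in dollars

    print(extra_cost_per_month())                   # 0.144 -> about 14 cents/month
    print(extra_cost_per_month(rate_per_kwh=0.15))  # 0.432 -> still under half a dollar at triple the rate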


Originally posted by: SickBeast
plus your entire rig will run a little cooler, which will help out with your CPU and system temperatures.

Since both cards have DHES coolers that vent their hot air outside the case, I don't think either one will raise case temps. In fact, they might even help lower temps in a poorly ventilated case.
 

mruffin75

Senior member
May 19, 2007
343
0
0
Originally posted by: Creig
Originally posted by: SickBeast
Plus, the PSU needs to have the new 8-pin connectors which is a PITA.

You do not need to have an 8-pin connector in order to use the 2900XT. The only time you need an 8-pin connector is if you intend to overclock the card and void your warranty. The 2900XT works just fine with the 6-pin connectors.

Overclocking with the 8-pin connector doesn't void the warranty... overclocking has been accepted on ATI cards for a while now.

 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: tuteja1986
If you look at old GPU benchmarks, the X800 and X1900 blow away their NVIDIA counterparts in new games like BioShock.

That's largely because the X800/850 and X1900 architectures were designed from the ground up to be faster in shader-intensive applications than in texture-intensive ones. As applications focus more and more on shaders (and it's not as if BioShock uses super-high-resolution textures), you'll see ATI's old-gen cards outperform similar-gen NV cards. Sure, drivers make an impact, but in this case it's mostly related to architecture (e.g. 48 pixel shader pipelines in the X1900 series vs. 24 for G70).
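Just to put a number on it: 48 / 24 = 2x the raw pixel-shader width before clock speeds are even factored in, which is the kind of gap that shows up in shader-heavy titles.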
 

Sonikku

Lifer
Jun 23, 2005
15,906
4,930
136
The new cards are just on the horizon, with Nvidia cards from last year seemingly the same price now as they were then. I'd wait, but that's just me.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: mruffin75
Originally posted by: Creig
Originally posted by: SickBeast
Plus, the PSU needs to have the new 8-pin connectors which is a PITA.

You do not need to have an 8-pin connector in order to use the 2900XT. The only time you need an 8-pin connector is if you intend to overclock the card and void your warranty. The 2900XT works just fine with the 6-pin connectors.

Overclocking with the 8-pin connector doesn't void the warranty... overclocking has been accepted on ATI cards for a while now.

Are you sure? I thought it was still up to the individual companies whether or not they wanted to cover overclocking as a warranty item.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: Creig

Originally posted by: SickBeast
Also keep in mind that you will save money on electricity by going with the more efficient 8800

I do agree that you will save money, but it is a very, VERY, VERY small amount. Even if you compare the 2900XT to the 8800GTS 320, that's a 90 watt difference. The average time spent playing games is 8 hours per week. 8 hours per week x 4 weeks per month x 90 watts = 2,880 watt-hours = 2.88 kWh, and 2.88 kWh x $0.05 per kWh = 14.4 cents per month. That amount is so negligible that it cannot even be considered a relevant factor.

Where do you live that you get $0.05 per kWh??? That needs to double or maybe even, uh, TRIPLE! That would be 14.4 x 3 = 43.2 cents per month. That's a Coke every two months (12 oz can). Would you deny SickBeast his Coke???

 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: Creig

Originally posted by: SickBeast
Also keep in mind that you will save money on electricity by going with the more efficient 8800

I do agree that you will save money, but it is a very, VERY, VERY small amount. Even if you compare the 2900XT to the 8800GTS 320, that's a 90 watt difference. The average time spent playing games is 8 hours per week. 8 hours per week x 4 weeks per month x 90 watts = 2,880 watt-hours = 2.88 kWh, and 2.88 kWh x $0.05 per kWh = 14.4 cents per month. That amount is so negligible that it cannot even be considered a relevant factor.

Yep. Anyone who thinks it's even a factor is only fooling themselves. That, or they're very biased and a "fan boy" trying way too hard. For one, it can be much less than a 90W difference if the 320 is a factory-overclocked model, making the already minuscule amount even less; most sites already show it at about a 50-60W difference. Also, at idle (where everyone's PC is most of the time) the 2900XT uses less than the GTX and the Ultra, so you could make the argument that some of NV's cards cost more over a year's time. But that would be silly too, because the amount would be so small it wouldn't even matter.