2900XT close in price to the 8800GTS...


cmdrdredd (Lifer, joined Dec 12, 2001)
Oh, and check this out: Medal of Honor Airborne, 8800GTX vs HD2900XT vs 8800GTS 640MB. This game also uses Unreal Engine 3, but notice something? The ATI card spanks the GTX (although this is DX9 under XP). That tells me that Bioshock was poorly written and has a lot of Nvidia favoritism. It uses the same engine.

This is a test done using the latest drivers under XP. It should give a pretty good idea of what to expect overall from each card when running XP.

http://www.legionhardware.com/document.php?id=680&p=4

edit: Here is another review using the latest drivers for each card, but this time under Vista x64. This way you have a clear idea of what you can expect from this combination as well. (HD2900xt vs 8800GTS)

http://www.elitebastards.com/c...=view&id=455&Itemid=27
 

Keysplayr (Elite Member, joined Jan 16, 2003)
Originally posted by: Creig
Originally posted by: SickBeast
I can tell you what I was trying to do: give someone an honest recommendation based on the facts. I said he would save money on electricity. Is that not true? You should perhaps consider the fact that if 1,000,000 people used 2900XT's instead of GTS's, it could have a considerable environmental impact if nothing else.

Oh, come on now. That's a simply ridiculous statement. IF a million people used 2900XT's instead of GTS's? How about IF people all used passively cooled low-end video cards instead of powerhouses like the 8800GTX? Or IF people all used 17" LCDs instead of 24" or 30"? Or IF...

I think you get my point. You can "what if" all day long, but it won't change reality.

You brought up the subject "you will save money on electricity by going with the more efficient 8800". So I did the math and showed you how little that actually amounted to.

Or are you honestly trying to say that there is actually a gamer out there who would choose an 8800GTS 320 over an X2900XT for the sole reason that it's going to cost him 32 cents less a month for electricity?

I still want to know why you guys are suddenly playing down the importance of power consumption now, when in the past, it was a very big deal indeed. What has changed? Is electricity cheaper these days? Are PSU's a dime a dozen? What's going on here?
 

Keysplayr (Elite Member, joined Jan 16, 2003)
Originally posted by: mruffin75
Originally posted by: Creig
Originally posted by: apoppin

The "comeback" is your final warning about mod callouts!
- and to everyone else that is skating on thin ice in this thread.
Graphics moderator apoppin

apoppin, you cannot take action against someone who is responding to general comments made by somebody who happens to also be a moderator. If the comments being made are in direct relation to actions/warnings given by a moderator, then that is another matter. But moderators are also general members of the AT forums and should not be given any more or less latitude than is extended to everybody else while they are making non-moderator related posts.

I do not see where in Zstream's post that he is calling out keysplayr2003 because he is a moderator or any action he took while acting in a moderating role. Rather, it appears that the "comeback" is directed toward Zstream's assertion that keysplayr2003 tries to bait people while posting in a non-moderator capacity.



edit - Good God, I'm starting to sound like a lawyer.

Maybe you do sound like a lawyer (are you?!?! :) )... but you are correct...

I didn't see any issue with what Zstream said either.. Keys decided to get into this thread and post some comments, and they were replied to... just because he's a moderator doesn't mean we have to agree with them...

But back on topic (sort of!)..

Ah yes, you do need a beefier power supply for the 2900XT (most of the time), but maybe in the process of buying the beefier power supply you've actually purchased one that is more *efficient* than your previous 300-400W unit...

Then that could possibly cancel out any extra power draw from the 2900!! :)

I never said anyone had to agree with me. But they damn well better be civil and respectful to me or anyone else. That's the whole trick.
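
A quick way to see mruffin75's efficiency point quoted above: what a system pulls from the wall is its DC load divided by the PSU's efficiency, so a newer, more efficient unit can absorb some or all of the 2900XT's extra draw. A minimal sketch in Python; the wattages and efficiency figures are made-up illustrative assumptions, not measurements from the thread:

def wall_draw(dc_load_w, efficiency):
    """Watts drawn from the wall for a given DC load and PSU efficiency."""
    return dc_load_w / efficiency

# Hypothetical numbers, not measurements:
# 8800GTS-class system load on an older ~72%-efficient PSU
old_setup = wall_draw(dc_load_w=300, efficiency=0.72)
# HD2900XT-class system load (~60 W more) on a newer ~86%-efficient PSU
new_setup = wall_draw(dc_load_w=360, efficiency=0.86)

print(f"Old PSU at the wall: {old_setup:.0f} W")  # ~417 W
print(f"New PSU at the wall: {new_setup:.0f} W")  # ~419 W

With those hypothetical numbers the wall draw comes out roughly even, which is the "could possibly cancel out" scenario.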
 

cmdrdredd (Lifer, joined Dec 12, 2001)
Originally posted by: keysplayr2003
Originally posted by: Creig
Originally posted by: SickBeast
I can tell you what I was trying to do: give someone an honest recommendation based on the facts. I said he would save money on electricity. Is that not true? You should perhaps consider the fact that if 1,000,000 people used 2900XT's instead of GTS's, it could have a considerable environmental impact if nothing else.

Oh, come on now. That's a simply ridiculous statement. IF a million people used 2900XT's instead of GTS's? How about IF people all used passively cooled low-end video cards instead of powerhouses like the 8800GTX? Or IF people all used 17" LCDs instead of 24" or 30"? Or IF...

I think you get my point. You can "what if" all day long, but it won't change reality.

You brought up the subject "you will save money on electricity by going with the more efficient 8800". So I did the math and showed you how little that actually amounted to.

Or are you honestly trying to say that there is actually a gamer out there who would choose an 8800GTS 320 over an X2900XT for the sole reason that it's going to cost him 32 cents less a month for electricity?

I still want to know why you guys are suddenly playing down the importance of power consumption now, when in the past, it was a very big deal indeed. What has changed? Is electricity cheaper these days? Are PSU's a dime a dozen? What's going on here?

I don't care? I have a 600-watt PSU that can handle the card. Hell, people have Quad SLI PSUs rated at 1000 watts. You shouldn't worry much about power consumption if you're talking performance. It's like a big V8 engine in a muscle car: the more powerful it is, the more gas it burns. Obviously there is some give and take, like a C2D using less power than certain AMD chips yet being faster, or GTX vs HD2900XT in many games, etc.

In all honesty, I choose what is best for me, and right now the HD2900XT was my best option given that Nvidia still has not addressed the issue(s) that would affect me most. I don't care about saving money on my power bill; FPL already screws me and I'm paying upwards of $400 a month anyway.
 

Keysplayr (Elite Member, joined Jan 16, 2003)
Originally posted by: cmdrdredd
Originally posted by: keysplayr2003
Originally posted by: Creig
Originally posted by: SickBeast
I can tell you what I was trying to do: give someone an honest recommendation based on the facts. I said he would save money on electricity. Is that not true? You should perhaps consider the fact that if 1,000,000 people used 2900XT's instead of GTS's, it could have a considerable environmental impact if nothing else.

Oh, come on now. That's a simply ridiculous statement. IF a million people used 2900XT's instead of GTS's? How about IF people all used passively cooled low-end video cards instead of powerhouses like the 8800GTX? Or IF people all used 17" LCDs instead of 24" or 30"? Or IF...

I think you get my point. You can "what if" all day long, but it won't change reality.

You brought up the subject "you will save money on electricity by going with the more efficient 8800". So I did the math and showed you how little that actually amounted to.

Or are you honestly trying to say that there is actually a gamer out there who would choose an 8800GTS 320 over an X2900XT for the sole reason that it's going to cost him 32 cents less a month for electricity?

I still want to know why you guys are suddenly playing down the importance of power consumption now, when in the past, it was a very big deal indeed. What has changed? Is electricity cheaper these days? Are PSU's a dime a dozen? What's going on here?

I don't care? I have a 600-watt PSU that can handle the card. Hell, people have Quad SLI PSUs rated at 1000 watts. You shouldn't worry much about power consumption if you're talking performance. It's like a big V8 engine in a muscle car: the more powerful it is, the more gas it burns. Obviously there is some give and take, like a C2D using less power than certain AMD chips yet being faster, or GTX vs HD2900XT in many games, etc.

In all honesty, I choose what is best for me, and right now the HD2900XT was my best option given that Nvidia still has not addressed the issue(s) that would affect me most. I don't care about saving money on my power bill; FPL already screws me and I'm paying upwards of $400 a month anyway.

That's fine. What I asked was, what has changed?
 

SickBeast (Lifer, joined Jul 21, 2000)
Originally posted by: Zstream
Originally posted by: SickBeast
Originally posted by: keysplayr2003
Originally posted by: Ackmed
Originally posted by: SickBeast

Also keep in mind that you will save money on electricity by going with the more efficient 8800, plus your entire rig will run a little cooler, which will help out with your CPU and system temperatures.

You were really trying in your whole post, but this "point" is just laughable.

Care to tell us how much per year someone would save? Lets see some hard factual numbers here, not some number you think up.

Trying to what, exactly? What is he trying to do?

And then:
Does a 2900XT require and use more power than an 8800GTS 320? (yes) (no)
Does a 2900XT run warmer than an 8800GTS 320? (yes) (no)

Aside from an exact monetary figure of how much money a given user would save or lose going with either card, one would have to agree that it would cost $x.xx per year more to run a card that requires and uses more power. Is it negligible? It sure could be. Is it substantial? It sure could be.
Yeah I agree...what was I trying to do?

Keys is right; your posting style *is* quite combative.

I can tell you what I was trying to do: give someone an honest recommendation based on the facts. I said he would save money on electricity. Is that not true? You should perhaps consider the fact that if 1,000,000 people used 2900XT's instead of GTS's, it could have a considerable environmental impact if nothing else.

I don't know how much it would cost per year. To me the bigger issue is the fact that you need a beefier PSU for the X2900XT. Even 90w more can get quite expensive. I'm pretty sure a 700w PSU is at least $30 more than a 600w PSU.

Sick, come on, this is a stretch and you know it. Keys is not right; he only tries to bait people and it shows. Consider that most of the 320MB cards are overclocked, and that when running in 2D mode the Nvidia GPUs use more power. So?

I also doubt that in the enthusiast market 15-25 cents a month is even noticeable. And no, you can run a 2900XT fine on a 500W PSU.

I'll wait to see the comeback for this one...

The "comeback" is your final warning about mod callouts!
- and to everyone else that is skating on thin ice in this thread.
Graphics moderator apoppin
My comeback is that the true cost is at least $2 per month.

I'd like to see some evidence that the ATI cards use less power at idle. In the AT article I read, the 2900XT used more power than any nVidia card, at idle or at load (although the cards were closer when idling).

Your overclocked GTS comments are irrelevant. People also overclock the X2900XT, at which point its consumption is outrageous to the point that you need an 8-pin power connection to maintain stability (and your warranty in some cases).

For the record, I don't remember the last time I saw Keys bait someone (if ever). His 'what was he trying to do' comment was in response to Ackmed's baiting.

As for the PSU, if a card draws 90w more than another card under full load, it NEEDS a more serious PSU. There is no way around it.
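
For what it's worth, the usual back-of-the-envelope PSU sizing is to add up the component draws and leave a healthy headroom margin, which is where an extra ~90 W at load starts to matter. A rough sketch in Python with guessed component wattages; none of these are measured figures from the thread:

COMPONENTS_W = {
    "CPU": 90,
    "motherboard + RAM": 60,
    "drives + fans": 40,
}

GPU_GTS_320_W = 150   # assumed full-load draw
GPU_2900XT_W = 240    # assumed full-load draw, ~90 W more

def recommended_psu_w(gpu_w, headroom=0.30):
    """Total estimated load plus ~30% headroom for spikes, aging, and rail limits."""
    total = sum(COMPONENTS_W.values()) + gpu_w
    return total * (1 + headroom)

print(f"8800GTS 320 build: ~{recommended_psu_w(GPU_GTS_320_W):.0f} W")  # prints ~442 W
print(f"HD2900XT build:    ~{recommended_psu_w(GPU_2900XT_W):.0f} W")   # prints ~559 W

Whether that lands at 500 W or 600 W obviously depends on the rest of the build, which is probably why the thread can't agree on it.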


 

cmdrdredd (Lifer, joined Dec 12, 2001)
Originally posted by: keysplayr2003
Originally posted by: cmdrdredd
Originally posted by: keysplayr2003
Originally posted by: Creig
Originally posted by: SickBeast
I can tell you what I was trying to do: give someone an honest recommendation based on the facts. I said he would save money on electricity. Is that not true? You should perhaps consider the fact that if 1,000,000 people used 2900XT's instead of GTS's, it could have a considerable environmental impact if nothing else.

Oh, come on now. That's a simply ridiculous statement. IF a million people used 2900XT's instead of GTS's? How about IF people all used passively cooled low-end video cards instead of powerhouses like the 8800GTX? Or IF people all used 17" LCDs instead of 24" or 30"? Or IF...

I think you get my point. You can "what if" all day long, but it won't change reality.

You brought up the subject "you will save money on electricity by going with the more efficient 8800". So I did the math and showed you how little that actually amounted to.

Or are you honestly trying to say that there is actually a gamer out there who would choose an 8800GTS 320 over an X2900XT for the sole reason that it's going to cost him 32 cents less a month for electricity?

I still want to know why you guys are suddenly playing down the importance of power consumption now, when in the past, it was a very big deal indeed. What has changed? Is electricity cheaper these days? Are PSU's a dime a dozen? What's going on here?

I don't care? I have a 600-watt PSU that can handle the card. Hell, people have Quad SLI PSUs rated at 1000 watts. You shouldn't worry much about power consumption if you're talking performance. It's like a big V8 engine in a muscle car: the more powerful it is, the more gas it burns. Obviously there is some give and take, like a C2D using less power than certain AMD chips yet being faster, or GTX vs HD2900XT in many games, etc.

In all honesty, I choose what is best for me, and right now the HD2900XT was my best option given that Nvidia still has not addressed the issue(s) that would affect me most. I don't care about saving money on my power bill; FPL already screws me and I'm paying upwards of $400 a month anyway.

That's fine. What I asked was, what has changed?

Nothing, never has. I never cared about how much power one card uses vs another. Only that it works for what I need it to work for.
 

Zstream (Diamond Member, joined Oct 24, 2005)
Originally posted by: cmdrdredd
Originally posted by: Zstream
Ok well here goes.

http://www.neoseeker.com/Artic...ews/ati_2900xt/10.html

http://techreport.com/articles.x/12458/15

http://www.pcstats.com/article...?articleid=2159&page=5


I think that people leave the PC idle more than they play games.

lol, I never knew that. Anyway, like I said, I don't care about power consumption. It's all about a card actually working in my games and being a good value to me. I don't care if your grandma hates ATI :laugh:


Most people don't... It is a sad state of affairs when people spew crap but rarely do any research. It also seems like the 2900XT is blowing the crap out of the 8800GTX in Bioshock, which uses the UT3 engine. It also kills the theory that you need a 600W PSU to run a 2900XT; that was debunked a while ago, but people still continue to believe FUD.


 

cmdrdredd (Lifer, joined Dec 12, 2001)
Originally posted by: Zstream
Originally posted by: cmdrdredd
Originally posted by: Zstream
Ok well here goes.

http://www.neoseeker.com/Artic...ews/ati_2900xt/10.html

http://techreport.com/articles.x/12458/15

http://www.pcstats.com/article...?articleid=2159&page=5


I think that people leave the PC idle more than they play games.

lol, I never knew that. Anyway, like I said, I don't care about power consumption. It's all about a card actually working in my games and being a good value to me. I don't care if your grandma hates ATI :laugh:


Most people don't... It is a sad state of affairs when people spew crap but rarely do any research. It also seems like the 2900XT is blowing the crap out of the 8800GTX in Bioshock, which uses the UT3 engine. It also kills the theory that you need a 600W PSU to run a 2900XT; that was debunked a while ago, but people still continue to believe FUD.

Also, another UT3-engine-based game, Medal of Honor Airborne, runs better on the HD2900XT. If Bioshock didn't purposely cripple ATI cards (yes, there's no question it does), then I believe it would be more even in DX10 as well.
 

Creig (Diamond Member, joined Oct 9, 1999)
Originally posted by: SickBeast
It said to use 10 cents per kilowatt hour, so I did. The rate varies. People in Hawaii could theoretically save close to $150 per year by using a 320MB 8800GTS.

How did you perform your calculations?

I used: 90 watts = 90 watt hours per hour the device is run. I multiplied that times 24 hours, then times 365 days. After that I multiplied the figure by .1 to reflect the price of electricity.

One of the errors I see in your formula is that you used 24 hours per day as the run time instead of 8 hours per week as stated in the study. 90 watts is the difference between the 8800GTS 320 and the X2900XT at full load. And since the average person games 8 hours per week, that's the only time there is a 90 watt difference between the two GPUs. When the two cards are at idle, there is only a 10 watt difference between the two.

So to break it down, the difference in power consumption between an 8800GTS 320 and an X2900XT is as follows:


90 watts difference x 8 hours of gaming per week = 720 watt-hours per week

720 watt-hours per week x 52 weeks per year = 37,440 watt-hours per year

37,440 watt-hours per year / 1,000 = 37.44 kWh per year

37.44 kWh per year / 12 months = 3.12 kWh per month

3.12 kWh per month x $0.10 per kWh = $0.312, or approximately 32 cents.


As a verification, you can go to that website you linked and enter:

Device/Wattage - 95 (52" ceiling fan, high) [95 watts is close enough to 90]
Amount used per day - 1 hr./day [Remember, the 90 watt difference is 8 hours per week which would be approx 1 hr./day]
Cost of Electric. - 10 cents
Days used per month - 31

Plug all those figures in and the answer comes out to $0.30 per month, which is right in line with what I originally calculated.
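
For anyone who wants to re-run the arithmetic above, here is a minimal sketch in Python using the figures quoted in the thread (90 W load difference, 10 W idle difference, 8 gaming hours per week, $0.10/kWh). The always-on idle estimate at the end is an added assumption, not part of Creig's calculation:

LOAD_DELTA_W = 90          # extra watts at full load (8800GTS 320 vs HD2900XT)
IDLE_DELTA_W = 10          # extra watts at idle
GAMING_HOURS_PER_WEEK = 8
PRICE_PER_KWH = 0.10       # dollars

def monthly_cost(delta_w, hours_per_week, price_per_kwh=PRICE_PER_KWH):
    """Monthly cost of drawing an extra delta_w watts for hours_per_week each week."""
    kwh_per_month = delta_w * hours_per_week / 1000 * 52 / 12
    return kwh_per_month * price_per_kwh

print(f"Gaming-hours difference: ${monthly_cost(LOAD_DELTA_W, GAMING_HOURS_PER_WEEK):.2f}/month")  # ~$0.31

# Added assumption: machine left on 24/7, so the remaining hours sit at idle.
idle_hours = 24 * 7 - GAMING_HOURS_PER_WEEK
print(f"Idle-hours difference:   ${monthly_cost(IDLE_DELTA_W, idle_hours):.2f}/month")             # ~$0.69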
 

cmdrdredd (Lifer, joined Dec 12, 2001)
Originally posted by: Creig
Originally posted by: SickBeast
It said to use 10 cents per kilowatt hour, so I did. The rate varies. People in Hawaii could theoretically save close to $150 per year by using a 320MB 8800GTS.

How did you perform your calculations?

I used: 90 watts = 90 watt hours per hour the device is run. I multiplied that times 24 hours, then times 365 days. After that I multiplied the figure by .1 to reflect the price of electricity.

One of the errors I see in your formula is that you used 24 hours per day as the run time instead of 8 hours per week as stated in the study. 90 watts is the difference between the 8800GTS 320 and the X2900XT at full load. And since the average person games 8 hours per week, that's the only time there is a 90 watt difference between the two GPUs. When the two cards are at idle, there is only a 10 watt difference between the two.

So to break it down, the difference in power consumption between an 8800GTS 320 and an X2900XT is as follows:


90 watts difference x 8 hours of gaming per week = 720 watt-hours per week

720 watt-hours per week x 52 weeks per year = 37,440 watt-hours per year

37,440 watt-hours per year / 1,000 = 37.44 kWh per year

37.44 kWh per year / 12 months = 3.12 kWh per month

3.12 kWh per month x $0.10 per kWh = $0.312, or approximately 32 cents.


As a verification, you can go to that website you linked and enter:

Device/Wattage - 95 (52" ceiling fan, high) [95 watts is close enough to 90]
Amount used per day - 1 hr./day [Remember, the 90 watt difference is 8 hours per week which would be approx 1 hr./day]
Cost of Electric. - 10 cents
Days used per month - 31

Plug all those figures in and the answer comes out to $0.30 per month, which is right in line with what I originally calculated.

my head hurts... :D
 

Creig (Diamond Member, joined Oct 9, 1999)
Originally posted by: keysplayr2003
Originally posted by: Creig
Originally posted by: SickBeast
I can tell you what I was trying to do: give someone an honest recommendation based on the facts. I said he would save money on electricity. Is that not true? You should perhaps consider the fact that if 1,000,000 people used 2900XT's instead of GTS's, it could have a considerable environmental impact if nothing else.

Oh, come on now. That's a simply ridiculous statement. IF a million people used 2900XT's instead of GTS's? How about IF people all used passively cooled low-end video cards instead of powerhouses like the 8800GTX? Or IF people all used 17" LCDs instead of 24" or 30"? Or IF...

I think you get my point. You can "what if" all day long, but it won't change reality.

You brought up the subject "you will save money on electricity by going with the more efficient 8800". So I did the math and showed you how little that actually amounted to.

Or are you honestly trying to say that there is actually a gamer out there who would choose an 8800GTS 320 over an X2900XT for the sole reason that it's going to cost him 32 cents less a month for electricity?

I still want to know why you guys are suddenly playing down the importance of power consumption now, when in the past, it was a very big deal indeed. What has changed? Is electricity cheaper these days? Are PSU's a dime a dozen? What's going on here?

Power consumption will always be a factor in choosing a video card. Given two video cards that are equal in every aspect except power consumption, I would take the one that consumes less and I think most other people would, too.

But when Sickbeast tried to include the cost of the electricity as a legitimate reason to choose an 8800 series card over an X2900XT, that's when I chose to jump in and prove just how much of a non-factor it really was.

Besides which, the X2900XT is more of a direct competitor to the 8800GTS 640 than it is to the 320 that we've been comparing it to in this thread. If I were to redo my calculations and compare the 8800GTS 640 to the X2900XT, the cost difference per month would be even smaller as there is only a 65w difference between the two at full load. But I don't want to torture cmdrdredd with my equations any more than I already have so I'll leave that for another day. :D
 

bryanW1995 (Lifer, joined May 22, 2007)
What if oil went to $150/gallon and we had a major flare-up in the Middle East? We might end up paying, um, say, 30 cents per kilowatt-hour, which starts creeping up closer to $1/month for an HD2900XT instead of an 8800GTS 320. It would be crazy to pay that much more. In that case, Coke might go out of business since we'd all be giving one up EVERY SINGLE MONTH!!!
 

cmdrdredd (Lifer, joined Dec 12, 2001)
Originally posted by: bryanW1995
What if oil went to $150/gallon and we had a major flare-up in the Middle East? We might end up paying, um, say, 30 cents per kilowatt-hour, which starts creeping up closer to $1/month for an HD2900XT instead of an 8800GTS 320. It would be crazy to pay that much more. In that case, Coke might go out of business since we'd all be giving one up EVERY SINGLE MONTH!!!

What if pigs flew like birds?
 

Creig (Diamond Member, joined Oct 9, 1999)
Originally posted by: Creig
Originally posted by: apoppin

The "comeback" is your final warning about mod callouts!
- and to everyone else that is skating on thin ice in this thread.
Graphics moderator apoppin

apoppin, you cannot take action against someone who is responding to general comments made by somebody who happens to also be a moderator. If the comments being made are in direct relation to actions/warnings given by a moderator, then that is another matter. But moderators are also general members of the AT forums and should not be given any more or less latitude than is extended to everybody else while they are making non-moderator related posts.

I do not see where in Zstream's post that he is calling out keysplayr2003 because he is a moderator or any action he took while acting in a moderating role. Rather, it appears that the "comeback" is directed toward Zstream's assertion that keysplayr2003 tries to bait people while posting in a non-moderator capacity.



edit - Good God, I'm starting to sound like a lawyer.

You are *not* to edit out moderator comments in a post.

You also do not see Zstream's *pattern* of calling out Keys. I do. And IF you have a problem with my comments, then post in Personal Forum Issues.

Graphics Moderator apoppin

Very well, your request has been granted. Please check the Personal Forum Issues for a newly created thread regarding this issue.
 

gorcorps (aka Brandon, joined Jul 18, 2004)
Originally posted by: cmdrdredd
Originally posted by: Zstream
Originally posted by: cmdrdredd
Originally posted by: Zstream
Ok well here goes.

http://www.neoseeker.com/Artic...ews/ati_2900xt/10.html

http://techreport.com/articles.x/12458/15

http://www.pcstats.com/article...?articleid=2159&page=5


I think that people leave the PC idle more than they play games.

lol, I never knew that. Anyway, like I said, I don't care about power consumption. It's all about a card actually working in my games and being a good value to me. I don't care if your grandma hates ATI :laugh:


Most people don't... It is a sad state of affairs when people spew crap but rarely do any research. It also seems like the 2900XT is blowing the crap out of the 8800GTX in Bioshock, which uses the UT3 engine. It also kills the theory that you need a 600W PSU to run a 2900XT; that was debunked a while ago, but people still continue to believe FUD.

Also, another UT3-engine-based game, Medal of Honor Airborne, runs better on the HD2900XT. If Bioshock didn't purposely cripple ATI cards (yes, there's no question it does), then I believe it would be more even in DX10 as well.


I don't think they purposefully crippled ATI cards so much as just focused on Nvidia optimization and ignored ATI.
 

cmdrdredd (Lifer, joined Dec 12, 2001)
Originally posted by: jjzelinski
Crippled? Runs great on my Gecube X1950XT :)

It also runs great on ATI's HD2900XT, but really only under DX9. I don't think it's all drivers. I honestly think they have things coded specifically for Nvidia that ATI cards just don't handle the same way. This could be why AA works in every game except this one with ATI's drivers. Once they work around it, it will be OK, though. I find the game runs fine under DX10.
 

jjzelinski (Diamond Member, joined Aug 23, 2004)
You know, I was pretty underwhelmed by what DX10 has to offer in Bioshock, to the point of "who the hell cares?" If BS was designed to showcase the DX10 strengths of Nvidia's offerings, then they were pretty misguided IMHO.
 

cmdrdredd (Lifer, joined Dec 12, 2001)
Originally posted by: jjzelinski
You know, I was pretty underwhelmed by what DX10 has to offer in Bioshock, to the point of "who the hell cares?" If BS was designed to showcase the DX10 strengths of Nvidia's offerings, then they were pretty misguided IMHO.

Well, there isn't much to their DX10 code. Really, it just makes some particle effects smoother. No big deal. Plus, it feels like a port in the way the settings are presented; no advanced options like every other PC game has had for years.
 

BFG10K (Lifer, joined Aug 14, 2000)
Arguing power consumption in the context of an extra $2 a month is really quite ridiculous. If $2 makes all the difference to someone's power bill then they likely can't afford either card (or even a computer for that matter) to begin with.

Heat? Sure. Noise? Absolutely. But not $2 extra on a power bill; that's just silly.
 

cmdrdredd (Lifer, joined Dec 12, 2001)
Originally posted by: BFG10K
Arguing power consumption in the context of an extra $2 a month is really quite ridiculous. If $2 makes all the difference to someone's power bill then they likely can't afford either card (or even a computer for that matter) to begin with.

Heat? Sure. Noise? Absolutely. But not $2 extra on a power bill; that's just silly.

That's what I tried to say. Also, I don't find the HD2900XT noisy or particularly hot inside my case. I'm sure it would vary from person to person.
 

SickBeast (Lifer, joined Jul 21, 2000)
Originally posted by: cmdrdredd
Originally posted by: BFG10K
Arguing power consumption in the context of an extra $2 a month is really quite ridiculous. If $2 makes all the difference to someone's power bill then they likely can't afford either card (or even a computer for that matter) to begin with.

Heat? Sure. Noise? Absolutely. But not $2 extra on a power bill; that's just silly.

That's what I tried to say. Also, I don't find the HD2900XT noisy or particularly hot inside my case. I'm sure it would vary from person to person.
You guys have the right to your opinion(s).

IMO most people will use a new card like the 2900XT for at least a year, in which case they would pay at least $25 more in electricity costs. $25 is 8% of $300. Really, to me, it's pretty much the same thing as the card performing 8% worse in benchmarks. Why? Because you need to factor in the overall cost. For someone using the card for 2 years it turns into 16%, then 24% for 3 years.

$2/month sounds petty until you add it up over longer durations and compare it to the cost of the card.

I'm glad I did the research and now have an idea how much my computer costs to run.

Saying that someone can't afford the card if they can't afford $2/month extra is silly. Even if someone CAN, they're not going to like having to pay more each month (and it SHOULD factor into a purchasing decision, albeit in a small way).

I can see why it might seem petty but I do have the right to my opinion. :D

BTW, factor in the heat and noise, and really this issue becomes far greater IMO.
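
SickBeast's lifetime-cost framing is easy to check. A minimal sketch in Python, assuming his $2/month figure and a $300 card price (both taken from the thread, not measured here):

EXTRA_PER_MONTH = 2.00   # dollars, SickBeast's estimate of the running-cost difference
CARD_PRICE = 300.00      # dollars

for years in (1, 2, 3):
    extra = EXTRA_PER_MONTH * 12 * years
    print(f"{years} year(s): ${extra:.0f} extra, {extra / CARD_PRICE:.0%} of the card price")

# 1 year(s): $24 extra, 8% of the card price
# 2 year(s): $48 extra, 16% of the card price
# 3 year(s): $72 extra, 24% of the card price

Whether 8% per year of ownership is worth worrying about is exactly the judgment call the thread is arguing over.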