2900XT close in price to the 8800GTS...


tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: SickBeast
Originally posted by: cmdrdredd
Originally posted by: BFG10K
Arguing power consumption in the context of an extra $2 a month is really quite ridiculous. If $2 makes all the difference to someone's power bill then they likely can't afford either card (or even a computer for that matter) to begin with.

Heat? Sure. Noise? Absolutely. But not $2 extra on a power bill; that's just silly.

That's what I tried to say. Also, I don't find the HD2900XT noisy or particularly hot inside my case. I'm sure it would vary from person to person.
You guys have the right to your opinion(s).

IMO most people will use a new card like the 2900XT for at least a year, in which case they'll pay at least $25 more in electricity costs. $25 is 8% of $300. Really, to me, it's pretty much the same thing as the card performing 8% worse in benchmarks. Why? Because you need to factor in the overall cost. For someone using the card for 2 years, it turns into 16%, then 24% for 3 years.

$2/month sounds petty until you add it up over longer durations and compare it to the cost of the card.

I'm glad I did the research and now have an idea how much my computer costs to run.

Saying that someone can't afford the card if they can't afford $2/month extra is silly. Even if someone CAN, they're not going to like having to pay more each month (and it SHOULD factor into a purchasing decision, albeit in a small way).

I can see why it might seem petty but I do have the right to my opinion. :D

BTW, factor in the heat and noise, and really this issue becomes far greater IMO.
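For concreteness, the amortization math in the quoted post above works out as follows. This is a minimal sketch; the $300 card price and ~$25/year electricity figure are the thread's assumptions, not measured numbers, and SickBeast's 16%/24% come from rounding 8.3% down to 8% before multiplying:

```python
# A quick sketch of the amortization argument quoted above. Assumptions
# (from the thread, not measured): $300 card price, ~$25/year extra power.
CARD_PRICE = 300.00       # USD, 2900XT street price assumed in the thread
EXTRA_PER_YEAR = 25.00    # USD/year, SickBeast's electricity estimate

for years in (1, 2, 3):
    extra = EXTRA_PER_YEAR * years
    print(f"{years} yr: ${extra:.2f} extra = {extra / CARD_PRICE:.1%} of the card's price")

# 1 yr: $25.00 extra = 8.3% of the card's price
# 2 yr: $50.00 extra = 16.7% of the card's price
# 3 yr: $75.00 extra = 25.0% of the card's price
```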

First, I think AnandTech's video card reviews are rubbish, so I never quote them. I question their testing quality; even GameSpot does better reviews, which is embarrassing. I'd rather trust ComputerBase, Beyond3D, FiringSquad, TechReport and some Xbit Labs articles.

The 2900XT is nowhere near as noisy as the X1800XT/X1900XT or the 5800 Ultra. I have both the 2900XT and an 8800GTX. The 8800GTX is quieter, but the 2900XT's fan never really hits 100% in games. The noise isn't an issue when playing games or doing normal desktop tasks. Maybe the fan will start running at 100% when summer hits Australia (40°C).

Second, power consumption is a valid argument. The 2900XT does consume a crapload of power.

Third, the 2900XT is showing it can easily beat an 8800GTS, and in some cases even beat an 8800GTX. For example, in BioShock's DX9 mode under XP, the 2900XT beats the 8800GTX by 3 FPS at high resolution with everything set to high.

A lot of you want to bash the 2900XT for its performance without realizing that most games that ship never get tested on the 2900XT. BioShock went gold with a major show-stopper bug, and ATI had to release a hotfix. BioShock still has major performance issues when running in Vista DX10 mode. Also, if you've seen the newer benchmarks, the 2900XT does extremely well in OpenGL games like Prey and Quake 4.

Company of Heroes, World in Conflict, BioShock and other TWIMTBP games all went gold without really being tested on ATI hardware. This doesn't happen with other dev teams like Crytek, Valve, id and Epic, which make sure a game works on both companies' GPUs without major performance issues or show-stopper bugs at ship. But the majority of developers in the NVIDIA program take the easy route: they let NVIDIA fly its engineers in for a few days to tweak the game for them.


BioShock
Avivo HD vs. PureVideo HD
A much better article than the AnandTech version.


The good thing about the ATI 2900XT is that its true successor won't come out until Q3 2008, and the next refresh won't come until Q1 2008, so ATI will be intensively tweaking the drivers and fixing the TWIMTBP game bugs and performance issues. In the end the G80 will get the same treatment as the NV40, the G70 and the totally forgotten 7900GX2: NVIDIA will make sure games work, but won't bother fixing performance. It happened with the 7900GT, which now gets easily beaten by the X1900XT 256MB because ATI had to keep tweaking its drivers due to the 2900XT delay. Look at comparisons of the 6800GT vs. X800XL, or the X1900XT vs. 7900GT, in newer games.

ATI's driver team has shown me that they are far more efficient and capable than NVIDIA's driver team.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: SickBeast
IMO most people will use a new card like the 2900XT for at least a year, in which case they'll pay at least $25 more in electricity costs. $25 is 8% of $300. Really, to me, it's pretty much the same thing as the card performing 8% worse in benchmarks.

SickBeast, you are NOT saving an extra $25 a year in electricity by choosing an 8800GTS 320 over an X2900XT if you are the average gamer. You will only be saving about $0.32 per month, which is $3.84 per year.

Originally posted by: SickBeast
I used 90 watts = 90 watt-hours per hour the device is run. I multiplied that by 24 hours, then by 365 days, then converted to kWh and multiplied by $0.10 to reflect the price of electricity.

Originally posted by: SickBeast
Actually I just did the *proper* math, and you save $78.84 per year on electricity by using 90 watts less, if the device is on all the time. Even if it's only on 8 hours a day, you're saving over $25 annually, which does add up.


The average person plays games 8 hours per week, not 8 hours per day. Even if the computer is ON 8 hours per day, the only time there is a 90 watt difference between the 8800GTS 320 and the X2900XT is at FULL LOAD (i.e., playing games). When the 8800GTS 320 and the X2900XT are at idle, there is only a 10 watt difference between the two.
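To make the disagreement concrete, here is the kilowatt-hour math both posters are doing, as a minimal sketch. The 90 W load gap, 10 W idle gap, and $0.10/kWh rate come from the thread; the usage schedules are each poster's own assumption:

```python
# Convert a wattage gap plus a weekly usage schedule into dollars per year.
RATE = 0.10  # USD per kWh, SickBeast's assumed electricity price

def yearly_cost(watts, hours_per_week):
    kwh_per_year = watts * hours_per_week * 52 / 1000
    return kwh_per_year * RATE

# SickBeast's scenarios: the full 90 W load gap applies whenever the PC is on.
print(f"90 W, 24 h/day: ${yearly_cost(90, 24 * 7):.2f}/yr")  # $78.62 (~$78.84 using 365 days)
print(f"90 W, 8 h/day:  ${yearly_cost(90, 8 * 7):.2f}/yr")   # $26.21 (~$26.28 using 365 days)

# Creig's scenario: the 90 W gap only applies during ~8 h/week of gaming.
print(f"90 W, 8 h/week: ${yearly_cost(90, 8):.2f}/yr")       # $3.74 (~$0.32/month, rounded to $3.84/yr)

# The 10 W idle gap adds little even with the PC on 8 h/day (48 idle h/week).
print(f"10 W idle gap:  ${yearly_cost(10, 48):.2f}/yr")      # $2.50
```

The small discrepancies against the posters' figures come from using 52 weeks (364 days) versus 365 days, and from Creig rounding $0.31/month up to $0.32 before multiplying by 12.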
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Creig
The average person plays games 8 hours per week, not 8 hours per day. Even if the computer is ON 8 hours per day, the only time there is a 90 watt difference between the 8800GTS 320 and the X2900XT is at FULL LOAD (ie - playing games). When the 8800GTS 320 and the X2900XT are at idle, there is only a 10 watt difference between the two.
Would you deny me my Coca-Cola? In fact, over time I could definitely buy some beer with that money. :beer::beer::beer:

I do see your point, however, in that $4/year is insignificant.

I'm tired of beating this dead horse at this point. GL OP with your choice. :beer: