tuteja1986
Diamond Member
- Jun 1, 2005
Originally posted by: SickBeast
You guys have the right to your opinion(s).
Originally posted by: cmdrdredd
Originally posted by: BFG10K
Arguing power consumption in the context of an extra $2 a month is really quite ridiculous. If $2 makes all the difference to someone's power bill then they likely can't afford either card (or even a computer for that matter) to begin with.
Heat? Sure. Noise? Absolutely. But not $2 extra on a power bill; that's just silly.
That's what I tried to say. Also, I don't find the HD2900XT noisy or particularly hot inside my case. I'm sure it would vary from person to person.
IMO most people will use a new card like the 2900XT for at least a year, in which case they'll pay at least $25 more in electricity costs. $25 is about 8% of $300. Really, to me, it's much the same as the card performing 8% worse in benchmarks. Why? Because you need to factor in the overall cost of ownership. For someone using the card for 2 years it turns into 16%, then 24% for 3 years.
$2/month sounds petty until you add it up over longer durations and compare it to the cost of the card.
I'm glad I did the research and now have an idea how much my computer costs to run.
Saying that someone can't afford the card if they can't afford $2/month extra is silly. Even if someone CAN, they're not going to like having to pay more each month (and it SHOULD factor into a purchasing decision, albeit in a small way).
I can see why it might seem petty but I do have the right to my opinion.
BTW, factor in the heat and noise, and really this issue becomes far greater IMO.
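The cost math in the quoted post can be sketched roughly as follows. The wattage, daily usage, and electricity rate below are illustrative assumptions of my own, not figures from the thread; plug in your own numbers:

```python
def extra_cost_share(card_price, extra_watts, hours_per_day, rate_per_kwh, years):
    """Extra electricity cost over `years`, and that cost as a
    fraction of the card's purchase price."""
    kwh = extra_watts / 1000 * hours_per_day * 365 * years
    extra_cost = kwh * rate_per_kwh
    return extra_cost, extra_cost / card_price

# Hypothetical example: ~60 W of extra draw, 4 h/day of gaming,
# $0.25/kWh, on a $300 card -- roughly $2/month extra.
cost, share = extra_cost_share(card_price=300, extra_watts=60,
                               hours_per_day=4, rate_per_kwh=0.25, years=1)
print(f"${cost:.0f} extra over 1 year, {share:.0%} of the card's price")
```

With these assumed numbers the one-year overhead lands in the same ballpark as the $25 / ~8% figure argued above, and scales linearly with the number of years you keep the card.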
1st: I think AnandTech's video reviews are rubbish, so I never quote them. I question their testing quality. Even GameSpot does better reviews, which is embarrassing. I trust ComputerBase, Beyond3D, FiringSquad, TechReport, and some Xbit Labs articles.
The 2900XT is nowhere near as noisy as the X1800XT/X1900XT or the 5800U. I have both the 2900XT and the 8800GTX. The 8800GTX is quieter, but the 2900XT's fan never really hits 100% in games. It isn't noisy when playing games or doing normal desktop tasks. Maybe it will start running at 100% when summer hits Australia (40°C).
2nd: Power consumption is a valid argument. The 2900XT does consume craploads of power.
3rd: The 2900XT is showing it can easily beat an 8800GTS, and in some cases even an 8800GTX. For example, in BioShock's DX9 XP mode the 2900XT beats the 8800GTX by 3 FPS with everything on high at high resolution.
A lot of you want to bash the 2900XT for its performance when you don't realize that most games that ship don't get tested on the 2900XT. BioShock went gold with a major show-stopper bug, and ATI had to release a hotfix. BioShock still has major performance issues when running in Vista DX10 mode. Also, if you've seen a few of the newer benchmarks, the 2900XT does extremely well in OpenGL games like Prey and Quake 4.
Company of Heroes, World in Conflict, BioShock, and other TWIMTBP games all went gold without really being tested on ATI hardware. This doesn't happen with other dev teams like Crytek, Valve, id, and Epic, who make sure their games work on both companies' GPUs without major performance issues or show-stopper bugs. But most developers in the NVIDIA program take the easy route: they let NVIDIA fly in its engineers for a few days to tweak the games for them.
BioShock
Avivo HD vs. PureVideo HD
Much better articles than the AnandTech versions.
The good thing about the ATI 2900XT is that its true successor won't come out until Q3 2008, and the next refresh won't come until Q1 2008, so ATI will be intensively tweaking the drivers and fixing TWIMTBP game bugs and performance. In the end, the G80 will get the same treatment as the NV40, the G70, and the totally forgotten 7900GX2: NVIDIA will make sure the games work but won't bother fixing their performance. It happened with the 7900GT, which gets easily beaten by the X1900XT 256MB because ATI had to keep tweaking its drivers due to the 2900XT delay. Look at comparisons of the 6800GT vs. X800XL, or the X1900XT vs. 7900GT, in newer games.
The ATI driver team has shown me that they are far more efficient and better than NVIDIA's driver team.