
Xbitlabs review

Page 2
Originally posted by: Pabster
Originally posted by: ronnn
I certainly am looking to see power consumption at idle and load. I am assuming that AMD now takes the power pig award from NVIDIA, but until the idle figures are in, the jury is out.

Worrying about power consumption on a high-end graphics card is a moot point. If you're dropping $400+, I think a few extra kilowatt-hours of electricity aren't a concern.

Heat and noise, however, are a concern. And not coincidentally, they increase proportionately to power consumption.

It's hard to justify that power consumption when your competitor has a product with 2/3 the consumption at equal performance and image quality. If a card draws more power, I expect something in return for putting up with it.

Power consumption may not be a concern for you, but for some people it is.
 
Originally posted by: coldpower27
It's hard to justify that power consumption when your competitor has a product with 2/3 the consumption at equal performance and image quality. If a card draws more power, I expect something in return for putting up with it.

Power consumption may not be a concern for you, but for some people it is.

You're right. I don't care about power consumption. :laugh:

But I do care about noise. And heat. The less of both, the better. And let me tell ya, the 8800 series is no champion in either department, and (obviously) neither is R600.
 
Originally posted by: Pabster
Originally posted by: coldpower27
It's hard to justify that power consumption when your competitor has a product with 2/3 the consumption at equal performance and image quality. If a card draws more power, I expect something in return for putting up with it.

Power consumption may not be a concern for you, but for some people it is.

You're right. I don't care about power consumption. :laugh:

But I do care about noise. And heat. The less of both, the better. And let me tell ya, the 8800 series is no champion in either department, and (obviously) neither is R600.

True, but according to the results of the Xbitlabs article, the 8800s are superior in these two arenas for their respective weight classes. I don't think anyone is expecting passive 7600 GS level noise or power consumption at this level. 😉
 
No, that isn't what I'm saying. What I'm saying is if a few dollars a month makes a difference then how can someone drop down $300 for a video card?

Just curious, would it really be a few dollars a month? As an estimate, would it be closer to 1 or closer to 5?
 
Originally posted by: golem
No, that isn't what I'm saying. What I'm saying is if a few dollars a month makes a difference then how can someone drop down $300 for a video card?

Just curious, would it really be a few dollars a month? As an estimate, would it be closer to 1 or closer to 5?


vs. the GTS it would be 50-90W more.

Okay, so 2 hours a day (full-load 3D gaming) at 90W extra would be 180 watt-hours per day x 30 days (let's say you game a lot) = 5,400 watt-hours a month x 14.49 cents per kilowatt-hour (California's rate; the national average is about 11 cents) = about 78 cents extra per month, or $9.39 per year.

Add in a fan or A/C for the extra heat output, and then you'll have your full cost to run the card.

If you want to be extra geeky and play 5 hours a day every day, you'd spend about $2 a month extra.
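The back-of-the-envelope math in that post generalizes to a small helper. This is just a sketch: the 90W delta and the 14.49 ¢/kWh rate are figures from the thread, while the function name and everything else here is illustrative.

```python
# Sketch of the thread's electricity-cost estimate: extra monthly cost of a
# card that draws more power under load than a competitor.

def extra_cost_per_month(extra_watts, hours_per_day, cents_per_kwh, days=30):
    """Extra monthly cost in dollars for `extra_watts` of additional draw."""
    kwh_per_month = extra_watts * hours_per_day * days / 1000.0
    return kwh_per_month * cents_per_kwh / 100.0

# 90 W extra, 2 h/day of gaming, at California's ~14.49 c/kWh:
monthly = extra_cost_per_month(90, 2, 14.49)
print(f"${monthly:.2f}/month, ${monthly * 12:.2f}/year")  # → $0.78/month, $9.39/year
```

Plugging in 5 hours a day instead of 2 gives roughly $1.96 a month, matching the "$2 a month" figure above; UK readers can substitute their own rate.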
 
Originally posted by: golem
No, that isn't what I'm saying. What I'm saying is if a few dollars a month makes a difference then how can someone drop down $300 for a video card?

Just curious, would it really be a few dollars a month? As an estimate, would it be closer to 1 or closer to 5?

In Texas, probably closer to 5 😉 We get gouged at 15+ cents/kWh here 🙁 It's so bad that I try to use my notebook instead of my desktop for almost everything.
 
Swtethan and Arkaign,

Thanks for the info. At that rate, then I agree, power draw is not really relevant unless you need to buy a beefier power supply.
 
Originally posted by: swtethan
Originally posted by: golem
No, that isn't what I'm saying. What I'm saying is if a few dollars a month makes a difference then how can someone drop down $300 for a video card?

Just curious, would it really be a few dollars a month? As an estimate, would it be closer to 1 or closer to 5?


vs. the GTS it would be 50-90W more.

Okay, so 2 hours a day (full-load 3D gaming) at 90W extra would be 180 watt-hours per day x 30 days (let's say you game a lot) = 5,400 watt-hours a month x 14.49 cents per kilowatt-hour (California's rate; the national average is about 11 cents) = about 78 cents extra per month, or $9.39 per year.

Add in a fan or A/C for the extra heat output, and then you'll have your full cost to run the card.

If you want to be extra geeky and play 5 hours a day every day, you'd spend about $2 a month extra.

thanks ... i didn't really think anyone would actually calculate it
[same state too 🙂]

even 'on' the grid ... i can afford $25 a year ... i will drive a smaller car ... and plant a couple of extra trees
 
Originally posted by: swtethan
Originally posted by: golem
No, that isn't what I'm saying. What I'm saying is if a few dollars a month makes a difference then how can someone drop down $300 for a video card?

Just curious, would it really be a few dollars a month? As an estimate, would it be closer to 1 or closer to 5?


vs. the GTS it would be 50-90W more.

Okay, so 2 hours a day (full-load 3D gaming) at 90W extra would be 180 watt-hours per day x 30 days (let's say you game a lot) = 5,400 watt-hours a month x 14.49 cents per kilowatt-hour (California's rate; the national average is about 11 cents) = about 78 cents extra per month, or $9.39 per year.

Add in a fan or A/C for the extra heat output, and then you'll have your full cost to run the card.

If you want to be extra geeky and play 5 hours a day every day, you'd spend about $2 a month extra.


Those kinds of figures explain why a lot of people here aren't bothered by the wattage - but as I live in the UK I have to live with pricier bills! I'm pretty sure I'd be paying a large amount extra per year, enough to take into consideration when looking at the initial cost of the card.
 
Originally posted by: munky
Originally posted by: yacoub
power draw is more relevant when it comes to whether or not he needs to upgrade his PSU than electrical cost

Who in their right mind would build and run a high end system with a crappy PSU?

Lots of people never think they'll need to upgrade their PSU. Then they see that the PSU costs as much as the CPU and they're like, wow, wtf.
 
Originally posted by: munky
Who in their right mind would build and run a high end system with a crappy PSU?

LOL. Take a look around here and see how many people are running crappy PSUs. :laugh:

The fact is most people want to put the money into their video card or CPU or whatever; they don't consider the power supply to be a critical component.
 
Originally posted by: coldpower27
Originally posted by: Pabster
Originally posted by: ronnn
I certainly am looking to see power consumption at idle and load. I am assuming that AMD now takes the power pig award from NVIDIA, but until the idle figures are in, the jury is out.

Worrying about power consumption on a high-end graphics card is a moot point. If you're dropping $400+, I think a few extra kilowatt-hours of electricity aren't a concern.

Heat and noise, however, are a concern. And not coincidentally, they increase proportionately to power consumption.

It's hard to justify that power consumption when your competitor has a product with 2/3 the consumption at equal performance and image quality. If a card draws more power, I expect something in return for putting up with it.

Power consumption may not be a concern for you, but for some people it is.

Yeah, like what happened when the X1900 debuted: it uses more power than an X1800 but has far more shader power. Compared against the 7900 series, which uses less power, the X1900's higher power consumption is spent on shader power that outperforms the 7900 in most scenarios.
 
Originally posted by: evolucion8
Originally posted by: coldpower27
Originally posted by: Pabster
Originally posted by: ronnn
I certainly am looking to see power consumption at idle and load. I am assuming that AMD now takes the power pig award from NVIDIA, but until the idle figures are in, the jury is out.

Worrying about power consumption on a high-end graphics card is a moot point. If you're dropping $400+, I think a few extra kilowatt-hours of electricity aren't a concern.

Heat and noise, however, are a concern. And not coincidentally, they increase proportionately to power consumption.

It's hard to justify that power consumption when your competitor has a product with 2/3 the consumption at equal performance and image quality. If a card draws more power, I expect something in return for putting up with it.

Power consumption may not be a concern for you, but for some people it is.

Yeah, like what happened when the X1900 debuted: it uses more power than an X1800 but has far more shader power. Compared against the 7900 series, which uses less power, the X1900's higher power consumption is spent on shader power that outperforms the 7900 in most scenarios.

Except it's not outperforming the cards it's trying to compete against; it's on par with the GTS 640/320 and uses more power.
 
Originally posted by: dreddfunk
Maybe careful savings? It may be hard to imagine that a lot of people fall into this category (willing to buy a $300 GPU and also worried about an extra $5/month) but I'm sure that there are a lot of gaming enthusiasts living on a budget out there.

In any event, initial price and continuing cost of ownership are two separate issues. Everyone should think about TCO when buying something, even if just in passing.

More like they're still living in their parents' basements and they'll get cut off if the power bill is too high!

But seriously... as said above.. the power draw is pretty marginal (especially when I'm sure they're not ALWAYS in 3D mode.... unless they really are in that basement!)
 
Power consumption is overrated; most modern gaming systems get by with a 500-watt PSU anyway. If you're running a quad-core, quad-CPU setup with 50 fans in your case, you should worry about your mental health more than power supply problems.

The electric bill change is a moot point too; it's not even noticeable to the average user.
 
Originally posted by: imaheadcase
Power consumption is overrated; most modern gaming systems get by with a 500-watt PSU anyway. If you're running a quad-core, quad-CPU setup with 50 fans in your case, you should worry about your mental health more than power supply problems.

The electric bill change is a moot point too; it's not even noticeable to the average user.

Newer gaming systems won't be able to get by with a 500W PSU, with the way these new video cards are sucking down wattage.
 