So AMD's hybrid CrossFire was announced some months ago, but they were quite clear that the first generation of it will NOT support turning off your video card while on the desktop.
nVidia just announced their version of it, and theirs WILL...
AMD:
+ increased performance
+ multiple monitor support
- no turning off the GPU
nVidia:
+ increased performance
+ GPU can be turned off
- single monitor only
Seems like AMD misses the mark yet again. I was going to switch to AMD for this technology, but with them not including it in the first generation and nVidia coming up with a version that does, I guess I will be buying nVidia. In fact, I am currently waiting: I was itching to upgrade my video card, but I will hold off until this tech is out and buy whichever comes up with it first (because I am tired of my 7900GS).
This does not bode well for the competitive market, since AMD has once again come up with something amazing and missed the mark on its implementation.
EDIT:
People keep arguing that power reduction is only useful in laptops, so let me use some MATH to back up what I say...
I live in Texas. The Texas average cost is 14 cents per kWh plus extra charges (if you use TXU; most people just haven't switched after deregulation). My rate is a few tenths of a cent above the lowest in Texas, coming to 12 cents per kWh (with all extra charges included in that figure).
A reasonable reduction for a mid-range card is 100 watts, from turning off the card and using the on-board GPU (probably as high as 200 watts for the truly beastly cards).
A 100 watt drop, assuming your computer is on 24/7 and you use the video card for an average of 4 hours a day, gives you 20 hours at 100 watts, aka 2 kWh a day. That's 730 kWh a year; at 12 cents per kWh, that's $87.60 a year...
If you are using a beastly card that draws 200 watts MORE than the onboard GPU at IDLE, then you should save twice that ($175.20)... If you have two beastly cards, it's four times the savings ($350.40 a year).
$87.60 in minimum savings per year for a person who runs his computer 24/7 is the most tangible savings I have EVER seen from an energy-efficient product.
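If you want to check the arithmetic yourself, here it is as a small Python sketch (the function and its name are just mine, for illustration; the wattage and rate figures are the assumptions from the paragraphs above):

```python
# Hypothetical helper, not any vendor's tool: dollars saved per year
# from powering down the discrete GPU while it would otherwise idle.
def annual_savings(watts_saved, idle_hours_per_day, price_per_kwh):
    kwh_per_day = watts_saved * idle_hours_per_day / 1000  # W x h -> kWh
    return kwh_per_day * 365 * price_per_kwh

# 24/7 machine, card in use 4 h/day -> idle 20 h/day, at 12 cents/kWh
print(annual_savings(100, 20, 0.12))  # ~87.6  -> mid-range card
print(annual_savings(200, 20, 0.12))  # ~175.2 -> one beastly card
print(annual_savings(400, 20, 0.12))  # ~350.4 -> two beastly cards
```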
Now, not everyone leaves their computer on 24/7.
I turn off my computer when not in use... on an average day I spend 8 hours doing general computing and 4 hours playing games. That's 8 hours of general computing where I could save 100 watts with a MID-RANGE (not even a high-end) card...
8 hours of 100 watts saved = 0.8 kWh of power reduction per day... 292 kWh per year... multiply by the low, low price of 12 cents per kWh (remember, nearly the lowest in Texas!) and it comes out to $35.04 a year.
Multiply instead by the Texas average of roughly 15 cents (the 14 cent base rate plus extra charges) and you get $43.80...
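Plugging my own usage into the same hypothetical formula as before:

```python
# Same sketch, with my own usage: 8 h/day of general computing where
# the discrete card could be off (assumed figures from the post above).
def annual_savings(watts_saved, idle_hours_per_day, price_per_kwh):
    kwh_per_day = watts_saved * idle_hours_per_day / 1000
    return kwh_per_day * 365 * price_per_kwh

print(annual_savings(100, 8, 0.12))  # ~35.04 at my 12 cent rate
print(annual_savings(100, 8, 0.15))  # ~43.80 at the ~15 cent Texas average
```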
That is with no change whatsoever to my computer usage... all I would need to do for this is buy a compatible mobo and video card and install the driver.
So while this is not a good justification for upgrading on its own, it IS a good reason to choose what to upgrade to.