Originally posted by: dadach
Nice to know the calculation, but it shouldn't really be a factor at all in deciding which gfx card to buy.
Originally posted by: taltamir
Originally posted by: pmv
Electricity prices are probably going to go up in the near future, also.
Personally I live in an underheated house in a not particularly warm country with expensive electricity, so no AC, and the computer serves in lieu of having the heating on during winter, as the heating is horribly expensive.
I suppose one could offset the cost of running the computer against the cost of having the TV on, as personally I find one tends to substitute for the other.
In Texas, the minimum price for electricity went up from 11 cents per kWh to 14 cents per kWh in the past year. There was a power company here that had contracts with people to provide a set price of 11 cents per kWh (I was actually using it), and it went out of business.
I am now paying 15.3 cents per kWh (I add another 1.3 cents per kWh to make it 100% renewable). Forget skimping and savings: zero pollution, and the more I waste, the more I subsidize the renewable industry... I still like savings, though, because I don't like tossing money away.
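To put those rates in context, here is a minimal sketch of the annual-cost arithmetic, using the 15.3 ¢/kWh Texas rate quoted above; the wattages and hours per day are illustrative assumptions, not measurements from any card in this thread.

```python
# Rough annual electricity cost of running a PC at a given draw.
# Rate: 14 c/kWh base + 1.3 c/kWh renewable rider = 15.3 c/kWh (Texas, per above).
# Wattages and daily hours below are assumed for illustration.

def annual_cost(watts, hours_per_day, rate_cents_per_kwh):
    """Dollars per year for a constant draw at the given rate."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * rate_cents_per_kwh / 100

rate = 14.0 + 1.3  # cents per kWh
print(annual_cost(300, 4, rate))  # 300 W gaming load, 4 h/day -> ~$67/yr
print(annual_cost(100, 8, rate))  # 100 W idle, 8 h/day -> ~$45/yr
```

Even a modest idle draw left on for long hours ends up in the same ballpark as a few hours of heavy gaming load.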
Originally posted by: Candymancan
Just replace all your lightbulbs in the house with energy efficient ones to offset the cost.
Or just stop eating McDonald's for a whole year and you'll save money and lose weight at the same time, lol.
Originally posted by: The Odorous One
Myself, well, I build machines for performance. I don't use any of that energy-saving crap on my desktops (BIOS, OS, or otherwise, all disabled). That is not to say I am an energy waster, quite the contrary. I simply shut down my machines when I don't game or need the excessive crunching. I use my low-powered laptop for nearly everything.
Originally posted by: pmv
Originally posted by: dadach
Nice to know the calculation, but it shouldn't really be a factor at all in deciding which gfx card to buy.
Well, it could be, depending on what one wants the card for.
I mean, if two cards offer similar performance (or one card vs. two cards in SLI), and one is $20 cheaper but over a year is likely to use $40 more electricity, why not take it into consideration? I think PCs, and gfx cards especially, have now gotten to the point where power use is becoming a non-negligible issue. Of course, Americans have cheaper power costs than us Euroweenies.
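The trade-off pmv describes is easy to sketch as a payback calculation; the numbers below (extra wattage, hours per day, and the rate) are hypothetical assumptions chosen to match the $20-cheaper example.

```python
# How long before a cheaper but hungrier card costs more overall?
# All inputs are illustrative assumptions, not measured figures.

def payback_years(price_delta, extra_watts, hours_per_day, rate_cents_per_kwh):
    """Years until the extra electricity eats the up-front savings."""
    extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    extra_cost_per_year = extra_kwh_per_year * rate_cents_per_kwh / 100
    return price_delta / extra_cost_per_year

# $20 cheaper, draws 50 W more, 6 h/day, 20 c/kWh (a European-ish rate):
print(payback_years(20, 50, 6, 20))  # -> under a year
```

At European rates the purchase savings can evaporate within the first year of ownership, which is exactly why the running cost belongs in the comparison.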
Originally posted by: taltamir
Originally posted by: The Odorous One
Myself, well, I build machines for performance. I don't use any of that energy-saving crap on my desktops (BIOS, OS, or otherwise, all disabled). That is not to say I am an energy waster, quite the contrary. I simply shut down my machines when I don't game or need the excessive crunching. I use my low-powered laptop for nearly everything.
Unless you are using a quad-core Extreme with triple-SLI GTX 280s, that statement is silly.
Money comes into play at some point. I am not saying "save energy to save the planet."
That is BS. Saving 10% energy just means you pollute 10% less. If you want to not pollute, switch to a 100% renewable plan; it will cost you 1.3 cents more per kWh (at least in Texas, over the 14-cent-per-kWh base price), you create zero pollution, AND the more you spend, the more you subsidize the technology...
The point of these calculations is price value for performance.
Originally posted by: nZone
This is not a full cost calculation of electricity consumption. The calculation here covers only the basic service charge; there are also delivery charges (in Massachusetts, at least).
Rates for Delivery Service:
Customer Charge: $6.21/month
Distribution Charge: 2.660¢/kWh
Transmission Charge: 1.037¢/kWh
Transition Charge: 0.233¢/kWh
Demand Side Management Charge: 0.250¢/kWh
Renewables Charge: 0.050¢/kWh
Originally posted by: Spartan Niner
And the underwhelming conclusion/summary of this thread is:
One graphics card is generally more efficient than two+. Mind boggling, isn't it?
That said, I underclock my ATI video card significantly, so at idle, paired with an undervolted C2D, I'm using very little power...
Originally posted by: seemingly random
Originally posted by: VirtualLarry
...
I'm the same way. I disable all power-saving crap, since it usually never works right anyways. (And my assumptions turned out to be right; look up the Gigabyte S3 standby issue.)
...
Which standby problem? I have both the GA-965G and GA-MA78GM which, according to a Kill A Watt, drop to ~four watts each in S3 and, with Vista32, resume smoothly and impressively quickly on a keypress or mouse move.
Originally posted by: VirtualLarry
At least on their 775 mobos, if you overclock the FSB to something over 300MHz, they have problems resuming from standby. There are threads in the motherboards forum about this issue. It's definitely a design flaw, probably in the BIOS, and not just a batch of bad units.