Here's a quad power consumption graph done by X-bit showing how much more efficient 45nm is compared to the 65nm quads.
This bodes well for all of the 45nm CPUs.
Originally posted by: aigomorla
Now how are you people supposed to cool 300W of heat on air?
Originally posted by: Amaroque
I don't overvolt my CPUs because I use them for 24/7 365 operation.
mostly fud. there was an off-topic discussion here
Originally posted by: Amaroque
I know that this is off the topic, but what is wrong with Lasik wwswimming? I had Lasik done several years ago, and I couldn't be more pleased. I have better than 20/20 vision...
Originally posted by: Idontcare
Originally posted by: Amaroque
I don't overvolt my CPUs because I use them for 24/7 365 operation.
If you don't overvolt, then you are likely good for 24/7 365x10 operation (i.e. 10 years or longer of continuous operation).
For those of us who have no intention of running today's relics in 3 years' time (where will you be in 2011?), we can live with 24/7 365x3 operation with some elevated Vcore along the way.
Originally posted by: aigomorla
Wow.. wonder what that works out to over a year in power costs.. I leave my tower on 24/7 OUCH!
Originally posted by: pm
We pay $0.07/kWh for electricity. For a ~125W computer idling 24/7 in my area (my computer idles at about 115W measured at the wall), one would pay $0.21/day, which is $6.30 per month. For a quad-core CPU running all four cores on something like Folding@Home 24/7 in a place like southern California, where rates are more like $0.13/kWh, one could expect to pay something closer to $18 per month.
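For anyone who wants to plug in their own wattage and electricity rate, here's a minimal Python sketch of the watts-to-dollars arithmetic pm is using. The 125 W idle draw and the $0.07 and $0.13 per-kWh rates are the figures quoted above; the helper names and the ~190 W full-load guess in the last line are illustrative assumptions, not numbers pm gives.

```python
# Rough sketch of the electricity-cost arithmetic in pm's post above.
# The 125 W idle draw and the $0.07 / $0.13 per-kWh rates come from the thread;
# the helper names and the ~190 W full-load guess are illustrative, not pm's.

def cost_per_day(watts, dollars_per_kwh, hours=24):
    """Cost of running a constant load of `watts` for `hours` hours."""
    kwh = watts / 1000 * hours
    return kwh * dollars_per_kwh

def cost_per_month(watts, dollars_per_kwh, days=30):
    return cost_per_day(watts, dollars_per_kwh) * days

if __name__ == "__main__":
    print(round(cost_per_day(125, 0.07), 2))    # 0.21 -> about $6.30/month at $0.07/kWh
    print(round(cost_per_month(125, 0.07), 2))  # 6.3
    # pm's ~$18/month Folding@Home figure at $0.13/kWh implies roughly 190 W
    # at the wall; 190 W is a back-calculation, not a number pm gives.
    print(round(cost_per_month(190, 0.13), 2))  # 17.78
```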
Originally posted by: Yoxxy
I have been running my QX9650 + 8800 Ultra and see almost 450W from the wall while gaming.
Overvolting these chips increases usage rapidly.
Originally posted by: Yoxxy
I wasn't complaining.
Originally posted by: Engineer
Originally posted by: pm
We pay $0.07/kWh for electricity. For a ~125W computer idling 24/7 in my area (my computer idles at about 115W measured at the wall), one would pay $0.42/day which is $6.30 per month.
125W would take 8 hours to equal a kilowatt-hour. You would get 3 kWh throughout the day, so you would pay $0.21 per day, not $0.42 per day???
Is my math fuzzy or yours?
Originally posted by: Yoxxy
I have been running my QX9650 + 8800 Ultra and see almost 450W from the wall while gaming.
Let's round it up to 500W...or 0.5 kWh...it's still only costing you about 10 pennies an hour while you play, even if you live in Hawaii and pay $0.20/kWh.
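Same back-of-the-envelope arithmetic for the gaming case, using the rounded 500 W figure and the $0.20/kWh worst-case rate from the post above; the 24-hour line is hypothetical, since nobody in the thread is claiming to game around the clock.

```python
# Gaming box: ~500 W at the wall (rounded up), $0.20/kWh worst-case rate
watts, rate = 500, 0.20
print(watts / 1000 * 1 * rate)   # 0.1 -> about ten cents per hour of gaming
print(watts / 1000 * 24 * rate)  # 2.4 -> hypothetical cost if it ran flat out for 24 hours
```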