Originally posted by: Creig
Originally posted by: jim1976
Originally posted by: Pale Rider
Ojhhh Noeess!!11 Change is bad!! Run!!! We want to run our CRAY SUPER COMPUTER on our 300 watt no name brand PSU!!! WHY CAN'T WE!?!?!?!!!111oenelolwtfbbq!!
You do realize this has nothing to do with the purchase of a high-end PSU, don't you? Anyone who doesn't invest in a good PSU and risks his high-end rig is irrational, to say the least. But that's not the point. The point is your electricity bill every month...
Not really. A video card only draws maximum power when engaged in some sort of 3D rendering. For web browsing, word processing and other 2D work, the card uses significantly less electricity. In the case of the 8800GTX, that can mean an average of 100 watts less at idle than during gaming. Since 100 watts is 0.1 kW, you can take the number of hours spent actually playing games per month and divide it by 10 to get the extra kilowatt-hours used while in 3D mode.
The average residential cost in the US in 2006 was 9.86¢ per kilowatt-hour. So if you spent the national average of 8 hours a week gaming, that works out to roughly 35 hours a month, or about 3.5 extra kWh, for an added cost of around $0.34 per month, less whatever your previous, less power-hungry card was already contributing to the bill. All in all, it would amount to a very small price to pay considering the increased level of graphics you'd be enjoying.
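If you want to plug in your own numbers, here's a quick back-of-the-envelope sketch of that calculation in Python. The 100 W load/idle delta and 9.86¢/kWh rate come from the figures above; HOURS_PER_WEEK is just the 8-hour example and is meant to be swapped out for your own gaming time.

```python
# Back-of-the-envelope: extra electricity cost of GPU time spent in 3D mode.
EXTRA_WATTS = 100       # 8800GTX draw under 3D load vs. idle (figure from above)
RATE_PER_KWH = 0.0986   # 2006 US average residential rate (figure from above)
HOURS_PER_WEEK = 8      # example gaming time; replace with your own

hours_per_month = HOURS_PER_WEEK * 52 / 12        # ~34.7 hours
extra_kwh = hours_per_month * EXTRA_WATTS / 1000  # watts -> kilowatt-hours
monthly_cost = extra_kwh * RATE_PER_KWH

print(f"Extra energy: {extra_kwh:.2f} kWh/month")   # ~3.47 kWh
print(f"Extra cost:   ${monthly_cost:.2f}/month")   # ~$0.34
```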