I stumbled on this thread and decided to do some super accurate testing (/sarcasm) with my Kill-A-Watt. I run two monitors, 16x10 and 12x10. I only use the second monitor for watching movies or TV, and it's usually turned off. Note that turning the 2nd monitor off does not reduce GPU power draw; the monitor has to be physically unplugged from the video card. I also tried running the 16x10 monitor at 12x10 (with no GPU scaling), but this still resulted in 450/1250 clocks.
1090T, 6950 2GB, 8GB RAM, 890GX, everything at stock:
Two monitors:
---
At 3.2GHz, total system draw was 137 watts idle. GPU ran at 450/1250
At 800MHz (CnQ), total draw was 132 watts. GPU at 450/1250
1 monitor:
---
At 3.2GHz, total system draw was 104 watts idle. GPU ran at 250/150
At 800MHz (CnQ), total draw was 99 watts. GPU at 250/150
So, going from 1.3V Vcore to 1.225V Vcore (CnQ) is worth about 5 watts. As an aside, running an X4 instead of an X6 would be worth another 15-20 watts of savings with CnQ, since Vcore drops to 1.0V on Phenom II quads.
Going from 2 monitors connected (at different resolutions) to 1 is worth about 33 watts. Another way to look at it: going from 450/1250 to 250/150 is worth 33 watts. At 24/7/365 usage and $0.12/kWh, those 33 extra watts cost roughly $35 a year. If we factor in S3 sleep and any time not spent at idle clocks (since the wattage would be the same then regardless) as 50% of the time, it translates into about $17 a year.
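For anyone who wants to plug in their own rate or duty cycle, here's a rough sketch of that arithmetic in Python (the annual_cost helper is just for illustration, using the $0.12/kWh and 50% figures above):

# Rough annual cost of extra idle power draw.
def annual_cost(extra_watts, rate_per_kwh=0.12, duty_cycle=1.0):
    hours_per_year = 24 * 365
    kwh_per_year = extra_watts / 1000 * hours_per_year * duty_cycle
    return kwh_per_year * rate_per_kwh

print(annual_cost(33))                   # ~$34.69/year at 24/7/365
print(annual_cost(33, duty_cycle=0.5))   # ~$17.35/year if only at idle clocks half the time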
The reason I was interested in this is as follows.
At some point last year, something changed, either with ATI's UVD drivers or with Microsoft's Media Foundation codecs/Media Center. I used to be able to play a game and watch TV simultaneously; now, when trying to do both at once, my (at the time) 5850 would run at 400MHz UVD clocks instead of 725MHz 3D clocks, making both the game and the TV stutter horribly. My 5770 and 6950 exhibit the same behavior, as I assume all AMD cards do.
Since I don't do Eyefinity gaming, this thread got me wondering about adding a second video card solely to power the second monitor. This should let both cards run at their lowest idle clocks for a net savings of about 20W, if we assume about 10W idle for low-end 5-series cards. It should also allow full 3D clocks on the main card and UVD clocks on the 2nd card. I suppose I could try using the IGP too, but I worry about OC potential with the IGP enabled. From a purely cost-savings standpoint, it might take around 2.5 years to recoup the cost of a 5450 via reduced electricity costs (rough math below). Arguably not worth it, but two cards would allow proper 3D clocks alongside UVD clocks on the second monitor/card.
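Here's that payback estimate as a quick sketch under some assumptions: the ~$26 card price is just a guess at what a 5450 went for, and the ~20W savings and 50% duty cycle carry over from the numbers above.

# Hypothetical payback period for a second low-end card (card price is an assumption).
def payback_years(card_price, watts_saved=20, rate_per_kwh=0.12, duty_cycle=0.5):
    annual_savings = watts_saved / 1000 * 24 * 365 * duty_cycle * rate_per_kwh
    return card_price / annual_savings

print(payback_years(26))   # ~2.5 years at roughly $10.50/year saved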
Edit
Main monitor on 6950, 2nd on 4290 IGP:
---
At 3.2GHz, total system draw was 105 watts idle. 6950 at 450/1250, IGP at ~483/666
At 800MHz (CnQ), total draw was 100 watts. 6950 at 450/1250, IGP at ~483/666
Will run my current OC and update with any stability issues.