Dual monitor idle power use of ATI/Nvidia cards


gorobei

Diamond Member
Jan 7, 2007
4,025
1,525
136
It seems that AMD does not have the same feature as Nvidia, where the card idles down correctly if all monitors use the same resolution and refresh rate. With the 6970, no matter what multi-monitor config I try, the memory clock never drops at all and the core clock only goes down to 500MHz (as opposed to 250MHz core, 150MHz memory with just a single monitor attached). This happens even if all monitors are using DisplayPort, although only one of my monitors supports DisplayPort natively; the other I tried with a DisplayPort->HDMI adapter... It would be interesting to know whether having native DisplayPort on all monitors could fix the issue.
That isn't the same as having all monitors on DisplayPort. The DP->HDMI adapter is just passing the signal requirements through to the card: the HDMI port is still using TMDS while the DP is on the native bus timing, so there's no FTR blank sync. You would need two native DP monitors to test it properly.
 

lifeblood

Senior member
Oct 17, 2001
999
88
91
Currently I use two monitors, but one is plugged into my discrete card and the other into the IGP. I only hear the fan on my discrete card spin up when I go into a game, so I assume both GPUs remain idle even when both monitors are active but in 2D mode. I wish I had a Kill A Watt; then I could actually test it and see what's really happening.
 

betasub

Platinum Member
Mar 22, 2006
2,677
0
0
WelshBloke's thread looked at ways of adjusting clocks for power saving with multi-monitor on AMD's 6xxx cards.

http://forums.anandtech.com/showthread.php?t=2131211&highlight=clock

I have a 6950 and three identical Samsung 1080p monitors with DVI/DP/VGA. Activating a second monitor (with the same connection, same res/refresh) causes the clocks to jump to an intermediate state: the default 450/1250 (not the UVD 500/1250, or any value in the BIOS). Tweaking clocks shows the 1250 is linked to the full-3D memory clock, and the 450 is a function of the same value: dropping the full-3D memory clock to 800 gives 300/800 multi-monitor clocks, and dropping it to 750 gives 250/750.
 

Kalakissa

Junior Member
Jun 14, 2006
9
0
0
That isn't the same as having all monitors on DisplayPort. The DP->HDMI adapter is just passing the signal requirements through to the card: the HDMI port is still using TMDS while the DP is on the native bus timing, so there's no FTR blank sync. You would need two native DP monitors to test it properly.

Yes, I figured it may still be the same problem. However, it would be good to know if using proper DisplayPort monitors will solve the issue. I wouldn't want to buy new monitors just to find out it won't help :)

WelshBloke's thread looked at ways of adjusting clocks for power saving with multi-monitor on AMD's 6xxx cards.

http://forums.anandtech.com/showthread.php?t=2131211&highlight=clock

Thanks, this seems to help at least a bit. Having to manually "whitelist" certain 3D-accelerated applications is a bit annoying, but at least it allows dropping the idle power use to more reasonable levels. With this method the secondary monitors also flicker briefly when the clocks switch, but that is something I am totally able to live with. Too bad it can't be enforced in the driver; if that momentary flicker is the reason the clocks don't drop automatically, it's quite a minor issue compared to the increased power use.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Like I've said in other threads, my GTX 470 idles at 51/101/135MHz (core/shader/memory) after tinkering around with res/refresh rates. Since I use two identical monitors, it was easy to get the timings the same, and voila, I don't need a secondary card anymore (which had initially provided a band-aid solution).

So this does not work for AMD cards?
 

xanaqq

Junior Member
Jul 9, 2011
1
0
0
Yes, I figured it may still be the same problem. However, it would be good to know if using proper DisplayPort monitors will solve the issue. I wouldn't want to buy new monitors just to find out it won't help :)



Thanks, this seems to help at least a bit. Having to manually "whitelist" certain 3D-accelerated applications is a bit annoying, but at least it allows dropping the idle power use to more reasonable levels. With this method the secondary monitors also flicker briefly when the clocks switch, but that is something I am totally able to live with. Too bad it can't be enforced in the driver; if that momentary flicker is the reason the clocks don't drop automatically, it's quite a minor issue compared to the increased power use.

I have been looking for this information for the last two weeks! I recently tested dual monitors over DVI with a 6770 and can confirm that the memory clock does not downclock at idle. I have a GTX 460 at the moment, which does downclock while running a dual-monitor setup. So I guess people like me, who want two monitors without the extra electricity bill and heat, will have to stick with Nvidia for the time being.
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
So the question is, why use dual when you can use Tri? Eyefinity AND real estate for your pleasure/work.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
[Image: blurayanddualmonitor.jpg (Blu-ray and dual-monitor power consumption, June 22, 2012)]

I believe that in Crossfire, the ZeroCore Power function also doesn't work (i.e., the 2nd GPU won't completely turn off its fan in 2D mode) when using multiple monitors.
 

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
[Image: blurayanddualmonitor.jpg (Blu-ray and dual-monitor power consumption, June 22, 2012)]

I believe that in Crossfire, the ZeroCore Power function also doesn't work (i.e., the 2nd GPU won't completely turn off its fan in 2D mode) when using multiple monitors.

Yikes, the 6870 is not on the more efficient side with two monitors. I wonder if the reviewer didn't downclock, since the default dual-monitor clocks are ridiculously high. I'm really curious to see triple-monitor power consumption, but I can't find a good graph.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Why is the 7970 using more watts vs. the 7970 GHz Edition? Is it some kind of optimization in the 7970 GHz Edition's BIOS or core?

A possible explanation is that GE cards have a lower stock voltage (1.162V vs. 1.175V for original 7970s). I'm not sure if that's enough for another 21W of power reduction. Where the GE falls apart is load 3D voltage (the original stays at 1.175V, but the GE cards go to 1.218V-1.256V :thumbsdown: ). Looks like the GTX 680 2GB is the winner for multi-monitor idle power use (though ironically it has neither the GPU power nor the VRAM to actually game on 3 monitors).
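
For what it's worth, a quick back-of-the-envelope check suggests the idle voltage delta alone is nowhere near 21W. This is a minimal Python sketch, assuming dynamic power scales with voltage squared at fixed clocks; the 60W idle figure is my assumption for illustration, not a measured number:

# Can the 1.162V vs. 1.175V idle voltage difference alone explain ~21W?
# Simplified model: dynamic power ~ V^2 at a fixed clock; this ignores
# leakage and any other BIOS/clock differences between the cards.
v_original = 1.175     # original 7970 idle voltage (from the post above)
v_ghz_edition = 1.162  # GHz Edition idle voltage

ratio = (v_ghz_edition / v_original) ** 2
print(f"dynamic power ratio: {ratio:.3f}")  # ~0.978, i.e. only ~2.2% less

idle_watts = 60.0  # assumed multi-monitor idle board power, for illustration
print(f"estimated savings: {idle_watts * (1 - ratio):.1f} W")  # ~1.3 W

So the voltage difference by itself is only worth a watt or two; the rest of the 21W gap would have to come from other BIOS or clock differences.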
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
I use triple monitors when I need the extra productivity for work or photo/file manipulation, but when I'm just browsing on the middle monitor, I use Win+P to toggle to a different display setup where only the middle monitor is active, which allows the video card to run in its low-power state.

So I'd suggest that as a compromise: it's quick and easy to hit the key combo and switch between single-monitor and multi-monitor modes, so you get the power savings most of the time but keep the flexibility to use multiple monitors whenever you need/want them.
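
If you'd rather script the switch than cycle through the Win+P overlay, Windows 7 and later ship DisplaySwitch.exe, which accepts the same four modes. A minimal Python sketch (the wrapper function name is just illustrative; the /internal and /extend switches are the standard DisplaySwitch flags):

import subprocess

# DisplaySwitch.exe (in System32 on Windows 7+) accepts one of:
#   /internal - primary monitor only
#   /clone    - duplicate displays
#   /extend   - extend the desktop across all monitors
#   /external - secondary monitor only
def set_display_mode(mode):
    subprocess.run(["DisplaySwitch.exe", "/" + mode], check=True)

set_display_mode("internal")  # single monitor: card can drop to idle clocks
# ... later, when you need the screen real estate back:
set_display_mode("extend")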
 

lifeblood

Senior member
Oct 17, 2001
999
88
91
I tested using two monitors on my 7850 last week. One is a 1080p using HDMI, the other a 1440x900 using analog. The clocks and power did not change whether one or two monitors were in use. Maybe it got fixed, or maybe it was because the second monitor used analog.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
I tested using two monitors on my 7850 last week. One is a 1080p using HDMI, the other a 1440x900 using analog. The clocks and power did not change whether one or two monitors were in use. Maybe it got fixed, or maybe it was because the second monitor used analog.

When you say they didn't change, what do you mean? Did they stay high all the time wasting power even if using just one monitor? Or, do you mean they stayed low all the time and saved power even if using two monitors?
 

lifeblood

Senior member
Oct 17, 2001
999
88
91
When you say they didn't change, what do you mean? Did they stay high all the time wasting power even if using just one monitor? Or, do you mean they stayed low all the time and saved power even if using two monitors?
With either one or two monitors connected and in use on a single card, the card behaved the same. It would idle at about 300MHz, jump to about 450MHz when I ran a YouTube video, and run at 860MHz when playing Mass Effect 3 (it would drop to 300 when I opened the menu), etc. I saw no difference between one monitor and two.

I wanted to test it with both monitors using digital connections, but I can't seem to switch my 1440x900 back to digital.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
TechPowerUp posted the clock profiles for the Multi-Monitor and Blu-Ray states that are standard on the 7970:

Multi-Monitor
GPU: 500MHz @ 0.96V
Memory: 1375MHz

Blu-Ray
GPU: 500MHz @ 0.98V
Memory: 1375MHz

For some reason, in his power consumption testing there is no discernible difference between the 680/7970 cards with 2 monitors:

[Image: power_multimon.gif (multi-monitor power consumption comparison)]
 

Colin1497

Junior Member
Apr 25, 2005
3
0
66
I stumbled on this thread and decided to do some super accurate testing (/sarcasm) with my Kill A Watt. I run two monitors, 16x10 and 12x10. The second monitor I only use for watching a movie or TV, and it's usually turned off. Note that turning the 2nd monitor off does not reduce GPU power draw; the monitor has to be unplugged from the video card. I also tried running the 16x10 monitor at 12x10 (with no GPU scaling), but this still resulted in 450/1250 clocks.

1090t, 6950 2gb, 8GB, 890GX, everything at stock:

Two monitors:
---
At 3.2GHz, total system draw was 137 watts idle. GPU ran at 450/1250
At 800MHz (CnQ), total draw was 132 watts. GPU at 450/1250

1 monitor:
---
At 3.2GHz, total system draw was 104 watts idle. GPU ran at 250/150
At 800MHz (CnQ), total draw was 99 watts. GPU at 250/150

So, going from 1.3V Vcore to 1.225V Vcore (CnQ) is worth about 5 watts. As another aside, running an X4 instead of an X6 would be worth another 15-20 watts of savings with CnQ, since Vcore will drop to 1.0V on Phenom II quads.

Going from 2 monitors connected (with different resolutions) to 1 is worth about 33 watts. Another way to look at it: going from 450/1250 to 250/150 is worth 33 watts. At 24/7/365 usage and $0.12/kWh, those extra 33 watts cost roughly $35 a year. If we factor in S3 sleep and any time not running at idle clocks (since at those times the wattage would be the same regardless) at 50%, it translates into $17 a year.
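
The arithmetic checks out; here are the same numbers in a few lines of Python:

# Annual cost of the extra 33W of multi-monitor idle draw at $0.12/kWh.
extra_watts = 33.0
rate_per_kwh = 0.12

annual_kwh = extra_watts * 24 * 365 / 1000  # ~289 kWh
annual_cost = annual_kwh * rate_per_kwh
print(f"24/7 idle: ${annual_cost:.0f}/year")  # ~$35

# Assume half the time is spent in S3 sleep or at non-idle clocks,
# where the extra draw wouldn't apply anyway:
print(f"50% duty cycle: ${annual_cost * 0.5:.0f}/year")  # ~$17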

The reason I was interested in this is as follows.

At some point last year, something changed, either with ATI's UVD drivers or with Microsoft's Media Foundation codecs/Media Center. I used to be able to play a game and watch TV simultaneously. Something changed, and now when trying to do those activities together, my (at the time) 5850 would run at 400MHz UVD clocks instead of 725MHz 3D clocks, making both the game and the TV stutter horribly. My 5770 and 6950 exhibit this same behavior, as I assume all AMD cards do.

Since I don't do Eyefinity gaming, this thread got me wondering about adding a second video card solely to power the second monitor. This should allow both cards to run at their lowest idle clocks for a net savings of about 20W, if we assume about 10W idle for low-end 5-series cards. It should also allow full 3D clocks on the main card and UVD clocks on the 2nd card. I suppose I could try using the IGP too, but I worry about OC potential with the IGP enabled. From a pure cost-savings standpoint, it might take around 2.5 years to recoup the cost of a 5450 via reduced electricity costs. Arguably not worth it, but two cards would allow proper 3D clocks with UVD clocks on the second monitor/card.
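
The same sort of estimate for the 5450 payback (the $25 card price is my assumption; the 20W net savings and 50% idle duty cycle come from the reasoning above) lands right around that 2.5-year figure:

# Payback period for a second card saving ~20W net, counted only over
# the ~50% of time the system actually sits at idle clocks.
net_watts_saved = 20.0
rate_per_kwh = 0.12
card_price = 25.0  # assumed 5450 street price, not from the post

annual_savings = net_watts_saved * 24 * 365 * 0.5 / 1000 * rate_per_kwh
print(f"~${annual_savings:.2f}/year saved")  # ~$10.51
print(f"payback: ~{card_price / annual_savings:.1f} years")  # ~2.4 years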

Edit

Main monitor on 6950, 2nd on 4290 IGP:
---
At 3.2GHz, total system draw was 105 watts idle. 6950 at 450/1250, IGP at ~483/666
At 800MHz (CnQ), total draw was 100 watts. 6950 at 450/1250, IGP at ~483/666

Will run my current OC and update with any stability issues.

Necro'ing the thread here, but I found this in a search. FYI, it turns out that the tested configuration (main monitor on the 6950, 2nd on the 4290 IGP) is broken by the newer Catalyst drivers, which don't support the 4000 series and below. It seems you can't get the legacy Catalyst drivers to work alongside the newer Catalyst 13.1/13.4 drivers. Unfortunate.

http://forums.amd.com/game/messageview.cfm?catid=454&threadid=166082