Dual monitor idle power use of ATI/Nvidia cards

Kalakissa

Junior Member
Jun 14, 2006
9
0
0
Couldn't find any discussion about this with the search function.

Looking at reviews of the Fermi cards, I noticed a disturbing issue reported over at legitreviews.com: the cards do not idle properly when two monitors are attached. Not only that, but according to them the issue is also present on older Nvidia cards and on ATI cards as well. I decided to test my system that has a GTX 260 with a wattmeter, and sure enough when I have my second monitor attached the power use is ~30W higher than with just one monitor attached.

As a long-time dual monitor user who cannot live without a second monitor attached, I find this quite annoying - especially since none of the other reviews I can find seem to take this issue into account. For example, the Anandtech GTX 480/470 review has this graph:

[Graph: idle power consumption from the Anandtech GTX 480/470 review]


But it is not mentioned anywhere whether this is with one or two monitors attached - I'd guess it is with one, though. The same goes for the other idle graphs for temperature and noise - they cannot be relied upon if you plan on using two monitors.

The whole issue seems strange to me: why do the cards need to use so much more power just because there are two monitors attached to them? The root of the issue seems to be that the cards get locked to 3D clocks when a second monitor is attached and never clock down to idle clocks. Even stranger is that I can manually down-clock my GTX 260 with RivaTuner and see no adverse effects, though the power use doesn't drop quite to the level it does with just one monitor attached... Probably the real single monitor idle state drops the voltage as well.

I'd love to see an Anandtech examination of this issue; it would be especially interesting to have some comments from Nvidia and ATI on why this is happening and whether it is possible to fix. Dual monitor use is very common these days, so the issue affects many people.

edit: Oh, here is the legitreviews article. According to it, the idle power use of the GTX 480 is 80W higher with two monitors than with one!
 
Last edited:

Ryan Smith

The New Boss
Staff member
Oct 22, 2005
537
117
116
www.anandtech.com
My ears are on fire.:)

I had someone ask a similar question back in November about the 58xx series and how they too run at a higher performance state with multiple monitors plugged in. As far as I know the GTX 480 series is affected by the same issues the 58xx series is, so the technical explanation is the same. So here's that response:

If you recall our 5800 series article, we mentioned that AMD finally has the ability to change the clock speeds on GDDR5, using fast link retraining. In order to make FTR work without disrupting any images, FTR needs to be done during a v-blank period so that the monitor isn’t reading from the front buffer, as the front buffer will be momentarily unavailable during the FTR. This is very easy to accomplish when you only have 1 monitor, because there’s only 1 v-sync cycle to deal with.

The issue is that with multiple monitors, there’s no guarantee that all of the monitors will be perfectly synchronized. It’s possible for the monitors to be out of sync (particularly when using different display types, e.g. a DVI and a DP monitor), which results in flickering on any monitors not in sync with the monitor the FTR was timed off of. This is the flickering you see when you have an overclocked card, as the card is accidentally switching GDDR5 speeds when it shouldn’t be. [At the time, a card overclocked with CCC would not go into the correct 2-monitor PowerPlay idle state]

So the reason AMD keeps cards at a higher state when multiple monitors are attached is to prevent that flickering. This means at a minimum keeping the GDDR5 at whatever it defaults to (1000MHz/1200MHz). I’m not entirely sure why the GPU is kept at a higher state too, but my best guess is that there may be performance issues with trying to draw to 2 large monitors at such low clock speeds. Or it may be that this is just easier than creating another PowerPlay state.

Ultimately there is no "fix" because this is not a bug. If anything it's a limitation of TMDS technology. I believe the FTR issue can be fixed with DisplayPort since it uses 1 shared clock, but I don't have a pair of DP monitors on hand to test that right now.
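
To make the timing problem concrete, here's a minimal sketch (every refresh rate, phase offset, and window length below is a made-up number for illustration, not anything from AMD): it models the blanking intervals of two slightly mismatched displays and counts how often a window long enough for a memory retrain lines up on both.

```python
# Illustrative only: model the vertical blanking intervals of two displays and
# count how often a window long enough for a GDDR5 link retrain falls inside
# the blanking period of BOTH displays at once.
# Every number below is an assumption for the sketch, not a real panel timing.

FRAME_A = 1 / 60.0        # frame period of display A (s)
FRAME_B = 1 / 59.94       # frame period of display B (s), slightly mismatched
VBLANK = 0.0005           # assumed blanking window per frame (s)
RETRAIN = 0.0003          # assumed time needed to retrain the memory link (s)
PHASE_B = 0.004           # display B starts scanning out 4 ms later

def blanking_windows(frame_period, phase, duration):
    """Yield (start, end) of each blanking interval within `duration` seconds."""
    t = phase
    while t < duration:
        yield (t, t + VBLANK)
        t += frame_period

def safe_windows(duration=1.0):
    """Return blanking overlaps long enough to retrain the memory link in."""
    a = list(blanking_windows(FRAME_A, 0.0, duration))
    b = list(blanking_windows(FRAME_B, PHASE_B, duration))
    overlaps = []
    for a_start, a_end in a:
        for b_start, b_end in b:
            start, end = max(a_start, b_start), min(a_end, b_end)
            if end - start >= RETRAIN:
                overlaps.append((start, end))
    return overlaps

if __name__ == "__main__":
    # With a single 60Hz display there are ~60 usable windows every second;
    # with two unsynchronized displays it's whatever happens to line up,
    # often zero - hence the memory clock simply isn't dropped.
    print("safe retrain windows in 1 s:", len(safe_windows()))
```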
 

pitz

Senior member
Feb 11, 2010
461
0
0
Interesting issue. Is this something that would show up on, for instance, a laptop Quadro NVS 140M driving two 1920x1200 LCDs through DVI?
 

TemjinGold

Diamond Member
Dec 16, 2006
3,050
65
91
Hmm... do both monitors actually need to be on for this higher consumption, or is being plugged into the video card enough? I ask because my larger HDTV is always plugged into the HDMI port but rarely turned on (and my smaller TV is always off when I use the larger one). If I'm always using way more power, I might consider only plugging the HDMI in when I need to use it.
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
That was fast!

Thank you very much. There are already more questions, I see :)

Maybe we could get Kyle or someone who has reviewed the Eyefinity 6 version of the 5870 to give a heads-up on your assumption about DP panels fixing FTR.

I think I need to register at the HardOCP forums too.
 

gitano

Member
Aug 4, 2008
93
0
61
I might consider only plugging the HDMI in when I need to use it.

I do the same but on a 4850. You can just disable the unused monitor from the computer; on Vista or Windows 7, just press Winkey+P and you can leave the HDMI plugged in.
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
Hmm... do both monitors actually need to be on for this higher consumption, or is being plugged into the video card enough? I ask because my larger HDTV is always plugged into the HDMI port but rarely turned on (and my smaller TV is always off when I use the larger one). If I'm always using way more power, I might consider only plugging the HDMI in when I need to use it.

In my own test (I reached up and turned off 2 monitors) the card stayed clocked at 400/1200. It may work differently with a TV since they behave a little differently than monitors - just check your clocks with the TV off. I think you'll have to go into the driver settings and disable the monitor to get the clocks back down while the TV is off; you can easily set up a profile with a hotkey to do this too. Winkey+P also worked for me.
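
If you'd rather script it than reach for the hotkey, Windows 7 also ships DisplaySwitch.exe, which as far as I know is the same switcher the Win+P menu drives. Something along these lines should work - an untested sketch on my part:

```python
# Untested sketch: switch between single- and dual-display mode from a script
# on Windows 7 by calling DisplaySwitch.exe (the switcher behind the Win+P
# menu). /internal = primary display only, /extend = desktop across both.
import subprocess

def use_single_display():
    subprocess.run(["DisplaySwitch.exe", "/internal"], check=True)

def use_both_displays():
    subprocess.run(["DisplaySwitch.exe", "/extend"], check=True)

if __name__ == "__main__":
    use_single_display()  # the card should fall back to idle clocks after this
```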
 

Udgnim

Diamond Member
Apr 16, 2008
3,680
124
106
I decided to test my system that has a GTX 260 with a wattmeter, and sure enough when I have my second monitor attached the power use is ~30W higher than with just one monitor attached.

I'm pretty sure that is just the second monitor pulling the additional 30W, unless you have the PC and monitors plugged into different outlets.

I've read about dual monitor issues with the 58XX line (powerplay downclock is too low to drive dual displays) and GTX 4XX line (RAM stays at 3D clocks), but nothing regarding the GTX 2XX line.

I'd recommend using GPU-Z, though, to see the VDDC current usage and whether the clocks actually change going from single to dual monitor with your GTX 260. The GTX 260 also uses GDDR3 memory; the 58XX and GTX 4XX lines use GDDR5, and the GDDR5 is the reason for the 480's high idle power usage.
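
If you want to log it rather than watch GPU-Z, newer NVIDIA drivers also expose clocks and power through the nvidia-smi command line tool. A rough sketch like this (assuming your driver and card actually report those fields - some consumer cards just show "[Not Supported]") lets you watch the clocks while plugging the second monitor in:

```python
# Sketch: poll clock and power readings via nvidia-smi while attaching or
# detaching the second monitor. Assumes a driver new enough to support
# --query-gpu; some consumer cards report "[Not Supported]" for these fields.
import subprocess
import time

QUERY = "clocks.gr,clocks.mem,power.draw"

def sample():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=" + QUERY, "--format=csv,noheader"],
        text=True,
    )
    return out.strip()

if __name__ == "__main__":
    for _ in range(12):        # roughly a minute of samples
        print(sample())        # e.g. "50 MHz, 135 MHz, 35.20 W"
        time.sleep(5)
```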
 
Last edited:

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
I'm pretty sure that is just the second monitor pulling the additional 30W, unless you have the PC and monitors plugged into different outlets.

I've read about dual monitor issues with the 58XX line (powerplay downclock is too low to drive dual displays) and GTX 4XX line (RAM stays at 3D clocks), but nothing regarding the GTX 2XX line.

I'd recommend using GPU-Z, though, to see the VDDC current usage and whether the clocks actually change going from single to dual monitor with your GTX 260. The GTX 260 also uses GDDR3 memory; the 58XX and GTX 4XX lines use GDDR5, and the GDDR5 is the reason for the 480's high idle power usage.

The 480 stays at full core clock too when running dual monitors :(
 

Kalakissa

Junior Member
Jun 14, 2006
9
0
0
Ryan: Thanks for the information. It sounds like an issue that could at least potentially be fixed in future hardware if Nvidia/ATI put some effort into it. Maybe even just the GPU could be underclocked and the memory left untouched; I would imagine that could save quite a bit of power. Would it be possible for future Anandtech GPU articles to test dual monitor idle power use as well? I'd personally like to upgrade to a GTX 470 in the near future, but since I require a silent and reasonably low-power system at idle, it is too risky to buy one without knowing the dual monitor idle power use first, and nobody seems to have tested that.

It would be interesting to know if using DisplayPort fixes the issue as well...

I'm pretty sure that is just the second monitor pulling the additional 30W, unless you have the PC and monitors plugged into different outlets.

Power use is measured for the system only (not the monitors), of course. The GPU clocks get locked to the 3D performance clocks (monitored with RivaTuner); it happens immediately when the second display is enabled.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
The HD 4800 series and GTX 200 series use the full 3D clocks when using dual monitors. The HD 5800 and GTX 400 series, however, don't use the full 3D clocks but a lower core/shader clock plus the full GDDR5 clock frequency. The new BIOS on the retail cards has a new fan profile when using dual monitors.
 

Ryan Smith

The New Boss
Staff member
Oct 22, 2005
537
117
116
www.anandtech.com
The HD 4800 series and GTX 200 series use the full 3D clocks when using dual monitors. The HD 5800 and GTX 400 series, however, don't use the full 3D clocks but a lower core/shader clock plus the full GDDR5 clock frequency. The new BIOS on the retail cards has a new fan profile when using dual monitors.
I was going to respond, but the above is what I wanted to say. AMD and NVIDIA already reduce the core clock to a lower level on their latest parts. It's not as low as their full idle clocks, but rather it's their mid-power state. At this point I believe that if they could get their core clocks even lower they would, since the GPU itself is still the biggest power draw.
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
I was going to respond, but the above is what I wanted to say. AMD and NVIDIA already reduce the core clock to a lower level on their latest parts. It's not as low as their full idle clocks, but rather it's their mid-power state. At this point I believe that if they could get their core clocks even lower they would, since the GPU itself is still the biggest power draw.

Thanks for the info.

Does anyone here have power comparison graphs of HD5850/70 and 470/480 GTX with dual monitors turned on compared to single monitor?

What happens to idle power consumption if the monitor or monitors are shut off, but still plugged into the video card?
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
On my 5850, unless you're actively driving the 2nd monitor, it acts as if there's only 1 monitor. So it can be plugged in (and even turned on) as long as you don't have Windows set to actively use it.
 

Kalakissa

Junior Member
Jun 14, 2006
9
0
0
Legitreviews has tested the retail GTX 470, and apparently the dual monitor power consumption numbers are just as bad as with the GTX 480 :( At idle there is a 75W increase in power use, temperature goes up 30C, and fan speed goes up 12% (source).
 

Kalakissa

Junior Member
Jun 14, 2006
9
0
0
Interesting - it seems Nvidia has done something in the new 257.21 drivers; my GTX 260 now properly drops to the same idle clocks with dual monitors that it does with just one monitor. I haven't measured the power draw, but the voltage regulator amperage shown in RivaTuner drops to the same level as well! It only seems to work when both monitors are using the same display mode, though.

Anyone with a 470/480 who could test whether this works for them as well? It would be excellent news if it did.
 

sgrinavi

Diamond Member
Jul 31, 2007
4,537
0
76
Using 257.21 here. When I have two monitors (each on its own card), both of my GTX 480s down-clock to a crawl on core, memory & shader. When I add a 2nd monitor to either card, the core clock of just that card drops back to about 50%, but the other clocks stay at 100%. The card with the 2nd monitor reports 8 more amps and 10 more watts than the other card.
 

Kalakissa

Junior Member
Jun 14, 2006
9
0
0
I just upgraded my GTX 260 to a GTX 460, and the clocks drop to the proper idle clocks* with it as well, as long as the two monitors have the same resolution & refresh rate. It seems to me that Nvidia has silently made some global fix for the issue in the recent drivers - maybe it just requires a specific monitor combo to work?

* 50/50/100MHz core/shader/memory
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
I use two monitors, and the GTX 480 that is handling video output downclocks at idle to 405 on the core but remains at the full memory speed of 1850 and stays at the full voltage the card would use under load. It idles at 62C or so :/

This is versus the second card, which properly downclocks to 50 on the core and 135 on the memory, lowers its voltage to 0.96V, and idles at 44C or so.
 

Kalakissa

Junior Member
Jun 14, 2006
9
0
0
I upgraded from the GTX 460 to a Radeon 6970 and figured I'd post some findings here from the "red side" as well in case anyone is interested.

It seems that AMD does not have the same feature Nvidia has, where idling works correctly if all monitors use the same resolution & refresh rate. With the 6970, no matter what multi-monitor config I try, the memory clock never drops at all and the core clock only goes down to 500MHz (as opposed to 250MHz core, 150MHz memory with just a single monitor attached). This happens even if all monitors are using DisplayPort, although I have only one monitor that supports DisplayPort natively; the other I tried with a DisplayPort->HDMI adapter... It would be interesting to know whether having native DisplayPort on all monitors could fix the issue.

This is slightly annoying as the idle power use of the 6970 seems to go up quite a lot as a result.
 

Jovec

Senior member
Feb 24, 2008
579
2
81
I ran a basic test a while back.

I stumbled on this thread and decided to do some super accurate testing (/sarcasm) with my Kill-A-Watt. I run two monitors, 16x10 and 12x10. I only use the second monitor for watching a movie or TV, and it is usually turned off. Note that turning the 2nd monitor off does not reduce GPU power draw; the monitor has to be unplugged from the video card. I also tried running the 16x10 monitor at 12x10 (with no GPU scaling), but this still resulted in 450/1250 clocks.

1090t, 6950 2gb, 8GB, 890GX, everything at stock:

Two monitors:
---
At 3.2GHz, total system draw was 137 watts idle. GPU ran at 450/1250
At 800MHz (CnQ), total draw was 132 watts. GPU at 450/1250

1 monitor:
---
At 3.2GHz, total system draw was 104 watts idle. GPU ran at 250/150
At 800MHz (CnQ), total draw was 99 watts. GPU at 250/150

So, going from 1.3V Vcore to 1.225V Vcore (CnQ) is worth about 5 watts. As another aside, running an X4 instead of an X6 would be worth another 15-20 watts of savings with CnQ, since Vcore will drop to 1.0V on Phenom II quads.

Going from 2 monitors connected (at different resolutions) to 1 is worth about 33 watts. Another way to look at it: going from 450/1250 to 250/150 is worth 33 watts. At 24/7/365 usage and $0.12/kWh, these 33 extra watts cost roughly $35 a year. If we assume that S3 sleep and time not running at idle clocks (when the wattage would be the same regardless) account for 50%, it translates into $17 a year.
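
For anyone who wants to redo that arithmetic with their own wattage or electricity rate, it's just this (same assumptions as above - the 33W delta, $0.12/kWh, and a 50% duty factor):

```python
# Back-of-the-envelope yearly cost of the dual-monitor idle penalty, using the
# same numbers as above; swap in your own measurements and electricity rate.
EXTRA_WATTS = 33          # measured system draw difference, 2 monitors vs 1
RATE_PER_KWH = 0.12       # electricity price, $/kWh
HOURS_PER_YEAR = 24 * 365

def yearly_cost(extra_watts, duty_cycle=1.0):
    """Dollars per year, for the fraction of time actually spent at idle clocks."""
    kwh = extra_watts / 1000.0 * HOURS_PER_YEAR * duty_cycle
    return kwh * RATE_PER_KWH

print(f"24/7 at idle clocks: ${yearly_cost(EXTRA_WATTS):.0f}/yr")       # ~$35
print(f"50% duty factor:     ${yearly_cost(EXTRA_WATTS, 0.5):.0f}/yr")  # ~$17
```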

The reason I was interested in this is as follows.

At some point last year, something changed either with ATI's UVD drivers or with Microsoft's Media Foundation codecs/Media Center. I used to be able to play a game and watch TV simultaneously. Now, when trying to do those activities together, my (at the time) 5850 would run at 400MHz UVD clocks instead of 725MHz 3D clocks, making both the game and the TV stutter horribly. My 5770 and 6950 exhibit this same behavior, as I assume all AMD cards do.

Since I don't do Eyefinity gaming, this thread got me wondering about adding a second video card solely to power the second monitor. This should allow both cards to run at their lowest idle clocks, for a net savings of about 20W if we assume about 10W idle for low-end 5-series cards. It should also allow full 3D clocks on the main card and UVD clocks on the 2nd card. I suppose I could try using the IGP too, but I worry about OC potential with the IGP enabled. From a purely cost-savings standpoint, it might take around 2.5 years to recoup the cost of a 5450 via reduced electricity costs. Arguably not worth it, but two cards would allow proper 3D clocks alongside UVD clocks on the second monitor/card.

Edit

Main monitor on 6950, 2nd on 4290 IGP:
---
At 3.2GHz, total system draw was 105 watts idle. 6950 at 450/1250, IGP at ~483/666
At 800MHz (CnQ), total draw was 100 watts. 6950 at 450/1250, IGP at ~483/666

Will run my current OC and update with any stability issues.
 

Udgnim

Diamond Member
Apr 16, 2008
3,680
124
106
The HD 4800 series and GTX 200 series use the full 3D clocks when using dual monitors. The HD 5800 and GTX 400 series, however, don't use the full 3D clocks but a lower core/shader clock plus the full GDDR5 clock frequency. The new BIOS on the retail cards has a new fan profile when using dual monitors.

I've set my reference 4870 512 to downclock to 600/200 GPU/RAM with ATI Tray Tools when in 2D with dual monitors. It seems like AMD played it safe and didn't try to be aggressive at all with downclocking the 4800 series.

edit: I should have looked at the dates more closely. This is a 1-year-old bump.
 
Last edited: