Couldn't find any discussion about this with the search function.
Looking at reviews of the Fermi cards, I noticed a disturbing issue reported over at legitreviews.com: the cards do not idle properly when two monitors are attached. Not only that, but according to them the issue is also present on older Nvidia cards and on ATI cards as well. I decided to test my own system, which has a GTX 260, with a wattmeter, and sure enough, with my second monitor attached the power use is ~30W higher than with just one monitor attached.
Being a long-time dual-monitor user who can't live without a second monitor attached, I find this quite annoying - especially since none of the other reviews I can find seem to take this issue into account. For example, the AnandTech GTX 480/470 review has this graph:
But it is not mentioned anywhere whether this is with one or two monitors attached - I'd guess it is with one, though. The same goes for the other idle graphs for temperature and noise - they cannot be relied upon if you plan on using two monitors.
The whole issue seems strange to me: why do the cards need to use so much more power just because there are two monitors attached to them? The root of the issue seems to be that the cards get locked to 3D clocks when a second monitor is attached and never drop down to idle clocks. Even stranger, I can manually down-clock my GTX 260 with RivaTuner and see no adverse effects, though the power use doesn't go down quite to the level it reaches with just one monitor attached... Probably the real single-monitor idle state drops the voltage as well.
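If anyone wants to check the clock behaviour on their own card, here is a rough sketch of how it could be done with Python and the pynvml bindings to NVIDIA's management library (I used RivaTuner and a wattmeter myself, so treat this purely as an illustration, and note that power readout is not supported on every card):

```python
import pynvml  # pip install nvidia-ml-py / pynvml; requires the NVIDIA driver

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Current core and memory clocks in MHz - these should drop at idle
gpu_clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
mem_clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)

# Performance state: P0 means full 3D clocks, higher numbers mean deeper idle
pstate = pynvml.nvmlDeviceGetPerformanceState(handle)

# Board power draw in milliwatts; wrap it since older cards may not report it
try:
    power = "%.1f W" % (pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0)
except pynvml.NVMLError:
    power = "n/a"

print("GPU clock: %d MHz, memory clock: %d MHz" % (gpu_clock, mem_clock))
print("Performance state: P%d, board power: %s" % (pstate, power))

pynvml.nvmlShutdown()
```

Running this at the desktop with one monitor and then with two should show whether the card stays stuck in P0 at 3D clocks in the dual-monitor case instead of dropping to its idle state.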
I'd love an AnandTech examination of this issue; it would be especially interesting to get some comments from Nvidia and ATI on why this is happening and whether it can be fixed. I believe dual-monitor use is very common these days, so the issue affects a lot of people.
edit: Oh, here is the legitreviews article. According to it, the idle power use of the GTX 480 is 80W higher with two monitors than with one!