R9 290 Multi-Monitor Idle Really Really Bad


(sic)Klown12

Senior member
Nov 27, 2010
572
0
76
What idle temps do you get with a single card, out of curiosity? I only have a problem with the statement that a card "should" idle at 90C. That is of course nuts. I mean, core clock speed has a direct correlation with temperatures, and generally speaking, core clocks are 150-300MHz while idling. With that being the case, idle temps should not be super high even if the fan speed is lax - 50-60C? Maybe, even though that would make me do a double take. 70-90C? That...I dunno. Not while idle.

At a complete idle, 45-47C depending on ambient temp. But any browser with GPU 2D acceleration, or a media player, causes the memory to jump back and forth from 150 to 1250MHz, so it then goes into the 50-52C range. I just set a new 2D profile that lowers the memory to 625MHz and now it's back down to 45C even with apps using the GPU. I'll play with adding a negative offset, but I'm sure I can get it down to 40-43C, which is a much more comfortable number coming from previous AMD cards.
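For anyone curious why halving the 2D memory clock moves idle temps at all, here is a minimal back-of-envelope sketch in Python. Every constant in it is an assumed, illustrative figure (not a measured 290 number), and the thermal model is deliberately crude: at a fixed ~20% fan speed, temperature roughly tracks total board power.

# Crude model: memory power scales roughly linearly with clock at a fixed voltage,
# and idle temperature rises roughly linearly with total board power at a fixed fan speed.
# All constants below are assumptions for illustration only.

AMBIENT_C = 27.0            # assumed room temperature
BASE_IDLE_W = 18.0          # assumed non-memory idle power of the card (guess)
MEM_W_AT_1250MHZ = 8.0      # assumed memory + PHY power at the full 1250MHz clock (guess)
C_PER_WATT = 1.0            # assumed degrees of rise per watt at ~20% fan (guess)

def idle_estimate(mem_clock_mhz):
    mem_w = MEM_W_AT_1250MHZ * (mem_clock_mhz / 1250.0)
    total_w = BASE_IDLE_W + mem_w
    temp_c = AMBIENT_C + C_PER_WATT * total_w
    return total_w, temp_c

for clk in (150, 625, 1250):
    watts, temp = idle_estimate(clk)
    print(f"{clk:>4} MHz memory: ~{watts:.0f} W board power, ~{temp:.0f} C (model only)")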
 
Last edited:

Capt Caveman

Lifer
Jan 30, 2005
34,543
651
126
What? I used Fermi with multiple screens. If the GTS 450 was somehow different, maybe that is the case, I dunno, but I remember how the GTX 580 was with multiple screens.

http://www.legitreviews.com/nvidia-geforce-gtx-580-gf110-fermi-video-card-review_1461/19


In any case, 70C while idling (for the 290) is downright stupid. So I dunno what's up with that.

I currently use a GTX 580, and I use Precision X to downclock it to 405/513 with my dual-monitor setup when I'm not gaming.

Can one not do this with a 290?
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
50 watts of difference for multi-monitor RAM speed, at idle?

Can someone verify that with their 290? I didn't think RAM could guzzle that much wattage at idle, even if the speed is increased to prevent multi-monitor flicker. Maybe someone with a Kill-A-Watt could check consumption at the wall and see how it changes when you connect and disconnect the 2nd monitor while allowing the computer to remain idle?
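If anyone does try that, here is a tiny Python sketch of the comparison KingFatty is asking for. The wall readings below are made-up placeholders, not measurements from any 290:

# Average a few Kill-A-Watt readings at idle with one monitor, then with two,
# and compare. Replace the placeholder numbers with your own readings.

single_monitor_w = [82, 81, 83, 82, 82]       # hypothetical wall readings, 1 screen, idle
dual_monitor_w   = [131, 130, 132, 131, 130]  # hypothetical wall readings, 2 screens, idle

avg_single = sum(single_monitor_w) / len(single_monitor_w)
avg_dual = sum(dual_monitor_w) / len(dual_monitor_w)

print(f"1 screen : {avg_single:.1f} W at the wall")
print(f"2 screens: {avg_dual:.1f} W at the wall")
print(f"delta    : {avg_dual - avg_single:.1f} W AC (before accounting for PSU efficiency)")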
 

Sable

Golden Member
Jan 7, 2006
1,130
105
106
Nvidia users can use Nvidia Inspector to force their cards into a lower power state when using multi-monitor setups.

[image: nvidia-dual-monitor-2.jpg]


(thanks again to whoever suggested this to me on here)
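For anyone forcing a power state this way, one way to confirm the card actually holds it is to poll nvidia-smi, which ships with the driver. A minimal Python wrapper is sketched below; note that power.draw may report "[Not Supported]" on some older cards.

# Poll nvidia-smi a dozen times to confirm the card stays in a low P-state
# (e.g. P8) with both screens attached. Assumes nvidia-smi is on the PATH.

import subprocess
import time

QUERY = "pstate,clocks.gr,clocks.mem,temperature.gpu,power.draw"

for _ in range(12):
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        text=True,
    ).strip()
    print(out)  # e.g. "P8, 324 MHz, 162 MHz, 38, 25.10 W"
    time.sleep(5)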
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
^ I remember using that for my 470s.

That was with a P67 board; with my new setup I just try to use the onboard HD4600.


There are issues with this as well: I was running DVI and it was causing 25-30% CPU usage and not allowing it to downclock. When I switched to HDMI on the onboard it went away.


*shrug*

This is what we do.
 

JBT

Lifer
Nov 28, 2001
12,094
1
81
My 680 didn't have this issue either. All the Radeons I've had the last few years do this. It's actually pretty annoying.
 

el etro

Golden Member
Jul 21, 2013
1,584
14
81
Nvidia drivers/GPUs power down much better in multi-monitor idle states. Report this "issue" on the AMD driver team feedback page (anyone on this forum can tell you the link... I don't know it). But I can tell you I've seen this characteristic in Radeon cards since I started reading reviews on tech sites (about four years now).
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
But to clarify the issue:

video card memory cannot be downclocked aggressively with multi-monitor configurations, so at idle the video card usually runs the memory at a higher speed than it would with just a single monitor.

I think all video cards with dedicated graphics memory do this? The question is how much extra wattage you will see at idle when you connect multiple monitors compared to just one.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
But to clarify the issue:

video card memory cannot be downclocked aggressively with multi-monitor configurations, so at idle the video card usually runs the memory at a higher speed than it would with just a single monitor.

I think all video cards with dedicated graphics memory do this? The question is how much extra wattage you will see at idle when you connect multiple monitors compared to just one.

Check my post on page 1. Good example of a card running at the same idle with 1 and 2 screens. It's certainly possible.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Check my post on page 1. Good example of a card running at the same idle with 1 and 2 screens. It's certainly possible.

They are two different architectures. You can't compare them, and it's OT.

The OP thought there was a problem with his 290. There's not though, it's how the card is designed to function.
 

Bryf50

Golden Member
Nov 11, 2006
1,429
51
91
They are two different architectures. You can't compare them, and it's OT.

The OP thought there was a problem with his 290. There's not though, it's how the card is designed to function.

I didn't think there was any problem with my specific 290. I've seen the high idle memory clocks on my GPUs before; I've just never seen it cause such high GPU temps and power usage. A video card idling at 70C just seems silly.

As far as the power usage goes, my UPS isn't the most accurate, but there is quite a large difference. I've been meaning to pick up a Kill A Watt, so maybe I'll measure it a bit more accurately. Also remember this is AC power usage, so PSU efficiency needs to be taken into account.
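A quick sketch of that efficiency correction, for reference. The 88% figure is just an assumed efficiency at light load, not a spec for any particular PSU:

# The Kill-A-Watt (or a UPS readout) measures AC at the wall, so the extra DC load
# on the card is smaller than the wall delta by roughly the PSU efficiency.

wall_delta_ac_w = 50.0   # example AC difference between 1-screen and 2-screen idle
psu_efficiency = 0.88    # assumed efficiency at this light load; could be 0.80-0.90

dc_delta_w = wall_delta_ac_w * psu_efficiency
print(f"~{wall_delta_ac_w:.0f} W at the wall is roughly {dc_delta_w:.0f} W of extra DC load")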
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
What are you using the second screen for?

I idle at around 55W with two screens using one 7950.

Have you tried using your HD4600?

[image: playback_zpsee0353a3.png]
 

Bryf50

Golden Member
Nov 11, 2006
1,429
51
91
What are you using the second screen for?

I idle at around 55W with two screens using one 7950.

Have you tried using your HD4600?

[image: playback_zpsee0353a3.png]

Interesting I didn't think of that. My second screen is just a small 17" for multitasking/web browsing. Definitely gonna try that when I get home.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I had an issue using it in 8.1 (no MVP), using HDMI on my main screen and DVI off the HD 4600.

I had high system interrupt usage (probably driver conflicts); when I switched them around, for whatever reason the problem ceased to exist.


Other than that it works as you would expect: very low power on the HD4600, even less than the 7950.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
That is a very clever way to reduce overall power. Who would think that adding a second, separate video card could result in lower overall power consumption than using just one of them?
 

JBT

Lifer
Nov 28, 2001
12,094
1
81
Some people say, and I believe it to be true of my CPU as well, that having the iGP enabled lowers how high you can OC the CPU.

But you might as well try it out with your setup and see how it goes. It can't hurt.

I personally run a 40" LCD on the wall, a 32" HDTV for my primary, and a 20" LCD for my secondary.

With the GTX680 I could do all of this off of it. Now the 7850 only supports two monitors. When I use the iGP I get an unstable system whether I OC or not. But again if it works for you AWESOME!
 
Last edited:

Bryf50

Golden Member
Nov 11, 2006
1,429
51
91
Some people say, and I believe it to be true of my CPU as well, that having the iGP enabled lowers how high you can OC the CPU.

But you might as well try it out with your setup and see how it goes. It can't hurt.

I personally run a 40" LCD on the wall, a 32" HDTV for my primary, and a 20" LCD for my secondary.

With the GTX680 I could do all of this off of it. Now the 7850 only supports two monitors. When I use the iGP I get an unstable system whether I OC or not. But again if it works for you AWESOME!

I already had my iGP enabled for some QuickSync stuff. I gave it a try last night and all seems to be going well. If the power usage numbers in GPU-Z are correct, the iGP really uses a paltry amount of power in 2D.
 

JBT

Lifer
Nov 28, 2001
12,094
1
81
I've got an older Sandy Bridge CPU; it's certainly possible it's not as much of an issue on Ivy or Haswell.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
+1

Does anyone else have a multi-monitor setup and a 290 that could comment?

I have two 27in 1440p Korean panels and a 290X. When completely idle, i.e. the panels are in power-saving mode and blank, Catalyst reports temps of 45-50C upon immediately waking. After a short period of web browsing and Netflix, I'll usually see temps between 68C and 72C. Not exactly idle, but a far step down from gaming. Watching the Catalyst utility, I'll see the core clock jump from ~300MHz to as high as 600MHz while watching video as well. The drivers never increase the fan speed above 20%, though; I imagine if I used Afterburner or some other utility to increase the minimum fan speed, the temp would drop.

I'm still waiting on my Extreme III cooler, but the ~70C 'idle' temps weren't high on my list of concerns.
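For reference, the sort of custom fan curve Bateluer is describing (e.g. set in Afterburner's fan curve editor) can be sketched like this in Python. The curve points are made-up examples, not recommended settings for the 290X:

# Linear interpolation along a made-up fan curve. The stock 290/290X behavior of
# sitting near ~20% fan is what lets "idle" drift toward 70 C.

FAN_CURVE = [(40, 25), (55, 35), (70, 50), (80, 65), (90, 85)]  # (temp C, fan %)

def fan_percent(temp_c):
    # interpolate between curve points, clamped at the ends
    if temp_c <= FAN_CURVE[0][0]:
        return FAN_CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(FAN_CURVE, FAN_CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return FAN_CURVE[-1][1]

for t in (45, 60, 72, 85):
    print(f"{t} C -> {fan_percent(t):.0f}% fan")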
 

birthdaymonkey

Golden Member
Oct 4, 2010
1,176
3
81
I had a GTX 670 with two Dell U2412Ms. It would downclock to 300/150 even with both screens running. But in Windows 7 on my system, this introduced 2D performance issues - choppy window animations and flash videos, for example. I fixed it by running one screen off the integrated graphics.

The problem went away in Windows 8, which actually gave smoother 2D performance with both displays plugged into the 670.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Can't believe it really takes 50W extra to display to a second screen, seeing as any old integrated graphics can do two screens and take < 10W, probably < 5W.

If it can't clock down properly then it's just badly designed. This is a new card - everyone has multiple screens these days - surely they had the common sense to design it with the ability to run at sensible clocks for two screens, not just one?
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
This card wants to be at 90C. If you want to lower temps, you need to make a custom fan profile.
Just like Klown12 said, 4GB of GDDR5 at full speed needs some juice.

Not that much juice. Heck, there are laptops with 4GB of GDDR5 at similar clock speeds and much lower power consumption.

http://www.notebookcheck.net/Review-Lenovo-IdeaPad-Y510p-Notebook.97470.0.html

That machine pulls around 105W in 3DMark06 for the whole system, with SLI 750Ms (2GB GDDR5 @ 5000MHz each). This seems to be a driver/architecture issue.
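To put a rough number on the memory itself, here is a hedged back-of-envelope sketch in Python. The per-chip wattage figures are assumptions (GDDR5 devices are typically on the order of a watt or two each when active), and it deliberately ignores the memory controller/PHY, which usually accounts for a larger share of the difference:

# Rough estimate of the DRAM-only power for the 290/290X's 4GB of GDDR5
# (16 x 2Gb chips on the 512-bit bus), using assumed per-chip figures.

chips = 16
w_per_chip_low = 1.0    # assumed active power per chip, low estimate
w_per_chip_high = 2.0   # assumed active power per chip, high estimate

print(f"DRAM alone: roughly {chips * w_per_chip_low:.0f}-{chips * w_per_chip_high:.0f} W at full clock")
print("The rest of a ~50 W wall delta would be memory controller/PHY, VRM losses, and PSU efficiency.")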