[PCPER] Testing GPU Power Draw at Increased Refresh Rates using the ASUS PG279Q


ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Alright, here are my findings from a GTX 970 and a 1080p XB240H G-Sync monitor.

Pretty interesting! I guess 144Hz really does just suck up the power. I'll keep my default refresh rate at 120Hz until this mess is fixed.

Edit: second monitor is interfering, see results below.

Could you try a forced downclock while keeping your second monitor connected? Just to see what the effects may be.
 

the unknown

Senior member
Dec 22, 2007
374
4
81
Could you try a forced downclock while keeping your second monitor connected? Just to see what the effects may be.

Power limit at 50%
[image: E0rXwic.jpg]


Downclocked core and memory as far as they would go.
[image: F69nAae.jpg]


Nothing changed but a few % points on the power usage.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
OK, thanks. So essentially you couldn't downclock far enough to make a difference, like reaching idle clocks.
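
For anyone who wants to double-check whether their card actually reaches idle clocks, here is a minimal monitoring sketch using NVIDIA's NVML bindings for Python (the pynvml package). The GPU index 0 and the one-second polling interval are assumptions; adjust for your own setup.

[CODE]
# Minimal idle-clock/power logger using NVIDIA's NVML via pynvml
# (pip install nvidia-ml-py). Assumes GPU index 0; adjust for
# multi-GPU rigs. Polls once per second until Ctrl+C.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # assumed: first GPU

try:
    while True:
        core = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        mem = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
        watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # reported in mW
        print("core %4d MHz | mem %4d MHz | %5.1f W" % (core, mem, watts))
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
[/CODE]

Run it at the desktop at 144Hz and again at 120Hz; on an affected setup the core and memory clocks should visibly fail to drop at 144Hz.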
 

nurturedhate

Golden Member
Aug 27, 2011
1,767
773
136
Anyone here with a 780 Ti and a 1080p 144Hz monitor? Mine won't downclock in this setup either, and I want to confirm this.

Also, to Karlitos' point: a quick Google search shows this seems to have been a somewhat known issue with 144Hz monitors dating back to 2013.
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
I have this same issue from time to time with Afterburner. In Afterburner, if you enable automatic 2D/3D profile loading, then at times, if you click on the information icon in AB, you will see a program listed as being detected as a 3D process while in desktop mode, causing AB to switch to the 3D profile and increase power consumption.

I wonder if using Afterburner to force 2D mode would help (if possible), or if there is a process related to the video driver being reported as a 3D process that you can tell RivaTuner to ignore.
 

nurturedhate

Golden Member
Aug 27, 2011
1,767
773
136
After messing around with this for about 15 minutes, I found I could correct it by adding RTSS to the NVIDIA Control Panel and then forcing adaptive power management. Of all things...

Edit: Still getting spikes back up to 1020MHz here and there
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
After messing around with this for about 15 minutes, I found I could correct it by adding RTSS to the NVIDIA Control Panel and then forcing adaptive power management. Of all things...

Tada....Thought that may be a possibility.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I can get this to happen when running a tri-monitor setup (2160p, 1440p, 1080p). Let me show you how increasing the GPU clock from 324MHz to 810MHz completely ruins my computing experience.

<snip>

34% fan speed is the lowest I can set it at, and it is silent in my usage. After enabling the 3rd monitor, you can see the fan speed shot up to 36%. Those extra 50 RPM now make it sound like an OEM 290X. Absolutely unbearable. The temperature part is even worse: it rose from 31C up to 43C. All the paper on my desk spontaneously combusted, and I now have a really nice farmer's tan on the right side of my body where my PC is. In reality, that 13C rise in temperature is totally irrelevant when added to the heat being generated by a 50" plasma TV, a 39" 4K monitor, and a 27" 1440p monitor, as well as the rest of the computer and HT equipment the PC is connected to.

So, basically your attempts to make this into some major deal are unfounded and childish. If you weren't looking at monitoring software, you would never notice the difference on my system.

Haha, nice! This instantly made me think of this:

https://33.media.tumblr.com/fc0268b9b6e1140b7787b2f4c45eb480/tumblr_n3xh291T9l1s3zvf8o1_500.gif

EDIT:

I also wouldn't say 13C isn't a big issue; it depends on your room and whatnot. However, what I find interesting is that I don't recall the same complaints when AMD broke Zero Core, which basically ran your second GPU at 99% load unless you disabled ULPS, which changed your system from GPU1 at idle / GPU2 off to both GPUs at idle. That really sucked.
 
Last edited:

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
EDIT:

I also wouldn't say 13C isn't a big issue; it depends on your room and whatnot. However, what I find interesting is that I don't recall the same complaints when AMD broke Zero Core, which basically ran your second GPU at 99% load unless you disabled ULPS, which changed your system from GPU1 at idle / GPU2 off to both GPUs at idle. That really sucked.


Is this to deflect from the issue or something? That's not what Zero Core does. Anyway, Zero Core is not really related to multi-GPU specifically; the thing that deals with shutting off slave GPUs is ULPS. The problem with the 2nd GPU going to full load happened to some people, but how it happened I don't know. I think it had to do with their inexperience and using AB incorrectly. Also, AMD changed how their driver worked, and AB at first was not updated to access the driver libraries properly, hence it "looked" all wonky (the unified usage monitoring). That said, it was a software issue, unlike NVIDIA's problem here.

With Southern Islands AMD is introducing ZeroCore Power, their long idle power saving technology. By implementing power islands on their GPUs AMD can now outright shut off most of the functional units of a GPU when the GPU is going unused, leaving only the PCIe bus interface and a couple other components active. By doing this AMD is able to reduce their power consumption from 15W at idle to under 3W in long idle, a power level low enough that in a desktop the power consumption of the video card becomes trivial. So trivial in fact that with under 3W of heat generation AMD doesn't even need to run the fan – ZeroCore Power shuts off the fan as it's rendered an unnecessary device that's consuming power.
*Also, I'd add that the reason we turn off ULPS, and by extension Zero Core, on the slave cards is that overclocking apps are incompatible with slave cards that have gone to sleep: poof, cards missing. The two are fundamentally at odds. Never mind that in years past the unofficial overclocking method was very popular, and if you went to overclock with ULPS on, it meant an instant BSOD.
 
Last edited:

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Is this to deflect from the issue or something? That's not what Zero Core does. Anyway, Zero Core is not really related to multi-GPU specifically; the thing that deals with shutting off slave GPUs is ULPS. The problem with the 2nd GPU going to full load happened to some people, but how it happened I don't know. I think it had to do with their inexperience and using AB incorrectly. Also, AMD changed how their driver worked, and AB at first was not updated to access the driver libraries properly, hence it "looked" all wonky (the unified usage monitoring). That said, it was a software issue, unlike NVIDIA's problem here.

I'm more talking about idle clocks. I know what Zero Core is; I loved it when I had CFX 7970s. It kept my second GPU off and my primary GPU at idle clocks (since I only used one monitor). When they came out with the frame pacing betas, it basically broke Zero Core, and suddenly, sitting at the desktop, I had one GPU at idle clocks and the other at full load clocks.

It had zero to do with MSI AB. During my troubleshooting, the first thing I did was uninstall all overclocking software. Just using AMD CCC and the OverDrive panel to monitor my hardware, the second GPU stayed at full clocks even just sitting on the desktop.

Reverting back to the previous driver, from before frame pacing, fixed the issue.

*Also, I'd add that the reason we turn off ULPS, and by extension Zero Core, on the slave cards is that overclocking apps are incompatible with slave cards that have gone to sleep: poof, cards missing. The two are fundamentally at odds. Never mind that in years past the unofficial overclocking method was very popular, and if you went to overclock with ULPS on, it meant an instant BSOD.

I happily used MSI AB for monitoring and overclocking with CFX GPUs without issue. The frame pacing beta dropped and suddenly my second card was at full load. My options were:
Revert back to the old driver, where magically Zero Core + MSI AB + overclocks worked
or
Use the frame pacing driver + disable ULPS (and thus Zero Core) + run 2 GPUs at idle while on the desktop

Thankfully, frame pacing broke CFX scaling with some games I played, so I happily reverted back to the old driver :D
 

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
I can concur that some drivers were broken; there were a lot of drivers coming out back then. When they were pushing those frame pacing drivers, teething issues were expected. Luckily there were many drivers to choose from. I didn't change drivers often, only if they were proven trouble and only if they were validated for 3DMark scoring; I was still benching back then. But yeah, back on topic: it's lucky for us it was just a software issue, because all the drivers now do frame pacing.
 
Feb 19, 2009
10,457
10
76
As for NVIDIA cards consuming more power at 144Hz in idle mode, that's an issue, albeit one that only affects a small subset of the overall market. It can't be equated with AMD's power consumption articles all these years. In fact, everyone here routinely attacks PCPer for being NVIDIA biased, yet they are the ones that brought this to light, so which media outlets were being unfair?

That's probably because very few people knew about the issue; I didn't. I don't know any gamer buddies who use 144Hz, just 120Hz or less, or 4K.

It's probably the same for most review sites: they have a test rig for that purpose, and it is unlikely to be running a 144Hz monitor.

It's an issue that only affects a small minority, so this won't do any damage to NV's image.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
That's probably because very few people knew about the issue; I didn't. I don't know any gamer buddies who use 144Hz, just 120Hz or less, or 4K.

It's probably the same for most review sites: they have a test rig for that purpose, and it is unlikely to be running a 144Hz monitor.

It's an issue that only affects a small minority, so this won't do any damage to NV's image.

I'm one of the subset that this could affect, but I run a custom vBIOS that pegs my cards at high clocks, so it doesn't matter much to me. I'm sure there are a few out there who would be upset at the cards using more power than they should at idle, but I doubt they'll be very vocal about it, even after this article.
 

amenx

Diamond Member
Dec 17, 2004
4,405
2,725
136
Apparently this is a complete non-issue for those who find out about it and ask in any 144Hz display thread or NVIDIA forum:

Set the monitor to 120Hz and use 120Hz on the desktop/in apps.
Open the NVIDIA Control Panel and go to the "Manage 3D settings" section.
Set the "Preferred refresh rate" of the monitor to the "Highest available" setting.
Hit "Apply".

Now when you launch a game it will use the highest available refresh rate, 144Hz, and it will drop back to the idle-capable 120Hz when you drop back out of the game to the desktop.

You can also use this (if you wanted to for some reason) to run ULMB mode on the desktop but not in games, since ULMB only works at 120Hz or 100Hz, and "Highest available" would jump back to 144Hz (and G-Sync, if previously enabled) on game launch.

http://hardforum.com/showthread.php?t=1854605
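
If you'd rather script the 120Hz-desktop half of that workaround than click through menus, here is a rough sketch for Windows using the pywin32 bindings. The target value of 120 and the use of the primary display are assumptions; it only works if the display actually exposes a 120Hz mode at its current resolution.

[CODE]
# Sketch: switch the desktop refresh rate on Windows (pip install pywin32).
import win32api
import win32con

# Grab the current mode of the primary display (None = primary device).
devmode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)

devmode.DisplayFrequency = 120              # assumed target; use 144 to go back
devmode.Fields = win32con.DM_DISPLAYFREQUENCY

# Flags = 0 applies the change dynamically without writing to the registry.
result = win32api.ChangeDisplaySettings(devmode, 0)
if result != win32con.DISP_CHANGE_SUCCESSFUL:
    print("Refresh rate change failed (code %d)" % result)
[/CODE]

Combined with "Preferred refresh rate: Highest available" in the NVIDIA Control Panel, games still get 144Hz while the desktop idles at 120Hz.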
 

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
Apparently this is a complete non-issue for those who find out about it and ask in any 144Hz display thread or NVIDIA forum:



http://hardforum.com/showthread.php?t=1854605


It's not really about the workaround, because you can't fix it in multi-monitor. In multi-monitor setups like Surround, which is like 3% of the user base at most, you save 30W while gaming yet lose an extra 60W while browsing. Still, it's interesting and ironic. How many hours are spent gaming vs. idling a day?
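
To put rough numbers on that gaming-vs-idling question, a back-of-the-envelope sketch; the 2h/8h split is an assumed usage pattern, not measured data:

[CODE]
# Daily energy trade-off for the multi-monitor case, using the 30W
# gaming saving and 60W browsing penalty quoted above. Hours are
# assumptions for illustration only.
gaming_saving_w = 30   # W saved while gaming
idle_penalty_w = 60    # extra W drawn while browsing/idling
gaming_hours = 2       # assumed
idle_hours = 8         # assumed

net_wh = idle_penalty_w * idle_hours - gaming_saving_w * gaming_hours
print("Net extra energy per day: %d Wh" % net_wh)   # 480 - 60 = 420 Wh
[/CODE]

Under those assumptions the extra idle draw dominates by a wide margin, which is exactly the irony being pointed at.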

Basically, at the high end, which is like the 3%ers, no one should be up in arms over power consumption, cuz y'all run super-overclocked, watercooled, and some crazies run single-stage 24/7, etc., etc.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
That's probably because very few people knew about the issue; I didn't. I don't know any gamer buddies who use 144Hz, just 120Hz or less, or 4K.

It's probably the same for most review sites: they have a test rig for that purpose, and it is unlikely to be running a 144Hz monitor.

It's an issue that only affects a small minority, so this won't do any damage to NV's image.

Any site that reviewed the Swift would have known it. None of them reported it, though.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
The topic is about higher NVIDIA power draw at increased refresh rates (144Hz), not multi-monitor, which affects both sides. :)

The 144Hz issue is a minor one with a workaround, for sure, but multi-monitor is handled better by AMD and Fury (at least for 2x monitors @ 120Hz and 3x monitors @ any Hz). An AMD Fury can now keep a low desktop idle GPU clock (~300MHz) with 2x 120Hz+ monitors or with 3x monitors, which NVIDIA currently can't do at all. Not a deal breaker for most, but definitely noteworthy for any review that takes power consumption into consideration.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
http://www.pcper.com/news/Graphics-...wer-Increases-High-Refresh-Rates-Promises-Fix

NVIDIA confirmed they will fix this in an upcoming driver.

We checked into the observation you highlighted with the newest 165Hz G-SYNC monitors. Guess what? You were right! That new monitor (or you) exposed a bug in the way our GPU was managing clocks for GSYNC and very high refresh rates. As a result of your findings, we are fixing the bug which will lower the operating point of our GPUs back to the same power level for other displays. We'll have this fixed in an upcoming driver.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126