[PCPER] Testing GPU Power Draw at Increased Refresh Rates using the ASUS PG279Q

Azix

Golden Member
Apr 18, 2014
1,438
67
91
http://www.pcper.com/news/Graphics-...raw-Increased-Refresh-Rates-using-ASUS-PG279Q

980Ti: [charts: powerdraw.png, powerdraw2.png]

Fury Strix: [charts: powerdrawamd1.png, powerdrawamd2.png]
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
Just from what you posted,

Why does the Fury top out at 142Hz?
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
I think it's more reasonable to look at it the other way: that is a huge power saving from exploiting lower clock speeds when using lower refresh rates.
 

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
Just from what you posted,

Why does the Fury top out at 142Hz? It uses over 100% more power according to those graphs. Hmm


You're reading the graph wrong. That's not a percentage-of-power graph, it's clock frequency in MHz. And the Fury is not topping out at 142Hz; that's just as far as the graph goes. Are you being selectively critical or something? The Ti also "tops out" at 142 in their graphs. If you want to be a stickler, send an email to PCPer.
 

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
I think it's more reasonable to look at it the other way: that is a huge power saving from exploiting lower clock speeds when using lower refresh rates.

Or do you just not worry about increased power consumption if you're running a Fury? Did the article not make sense to you? It should be common knowledge that high refresh rates on the desktop cause increased clocks, but no one ever really measured it to be sure.
 

Udgnim

Diamond Member
Apr 16, 2008
3,665
112
106
so if an Nvidia user has a 144 Hz monitor, then their GPU probably ends up drawing more power than an AMD card in the long run?
 

4K_shmoorK

Senior member
Jul 1, 2015
464
43
91
I always attributed the increased power draw/high idle clocks when not gaming @ 144Hz to some sort of bug in Nvidia's drivers.

If you have a 144Hz panel and a Ti, leaving the panel at 144Hz when not gaming/watching video/using hardware acceleration will cause the core to idle at ~850MHz.

However, if the user sets their refresh rate to 120Hz while not gaming, the idle core clock falls way down to ~135MHz, undoubtedly saving on power.

Maybe it isn't a bug? Maybe it is? Not sure.
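For anyone who wants to check this on their own card, here is a minimal sketch of how you might watch the clocks while toggling the refresh rate. It assumes nvidia-smi is on the PATH; the polling interval and the helper name are just illustrative choices, not anything from the PCPer article:

Code:
# Poll the current core/memory clocks via nvidia-smi so you can watch what
# happens when the desktop refresh rate is switched between 120Hz and 144Hz.
import subprocess
import time

def read_clocks():
    # clocks.gr and clocks.mem are standard nvidia-smi query fields.
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=clocks.gr,clocks.mem",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    first_gpu = out.strip().splitlines()[0]  # only look at GPU 0
    core_mhz, mem_mhz = (int(x) for x in first_gpu.split(","))
    return core_mhz, mem_mhz

while True:
    core, mem = read_clocks()
    print(f"core: {core} MHz  mem: {mem} MHz")
    time.sleep(5)  # switch the refresh rate and watch for the jump

If the behaviour described above holds, the core reading should sit near 135MHz at 120Hz and jump to roughly 850MHz as soon as 144Hz is selected.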
 

Osjur

Member
Sep 21, 2013
92
19
81
I always attributed the increased power draw/high idle clocks when not gaming @ 144Hz to some sort of bug in Nvidia's drivers.

If you have a 144Hz panel and a Ti, leaving the panel at 144Hz when not gaming/watching video/using hardware acceleration will cause the core to idle at ~850MHz.

However, if the user sets their refresh rate to 120Hz while not gaming, the idle core clock falls way down to ~135MHz, undoubtedly saving on power.

Maybe it isn't a bug? Maybe it is? Not sure.

That only applies to 144Hz 1440p monitors. You can run a 1080p monitor @ 144Hz and still get idle clocks. Of course, you have roughly 75% more pixels to push at 1440p, so that has to be the reason the clocks increase.

One user on another forum tried to lower the clocks with Inspector and got a green screen when running 1440p @ 144Hz, so something funky is going on.
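For reference, the rough arithmetic behind that "75% more pixels" figure (simple math, nothing from the article):

Code:
# 1440p vs 1080p: how many more pixels per frame?
qhd = 2560 * 1440   # 3,686,400 pixels
fhd = 1920 * 1080   # 2,073,600 pixels
print(f"{(qhd / fhd - 1) * 100:.0f}% more pixels at 1440p")   # prints ~78%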
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
I always attributed the increased power draw/high idle clocks when not gaming @ 144Hz to some sort of bug in Nvidia's drivers.

If you have a 144Hz panel and a Ti, leaving the panel at 144Hz when not gaming/watching video/using hardware acceleration will cause the core to idle at ~850MHz.

However, if the user sets their refresh rate to 120Hz while not gaming, the idle core clock falls way down to ~135MHz, undoubtedly saving on power.

Maybe it isn't a bug? Maybe it is? Not sure.

It must be a bug, as I just tried it on a single ROG Swift PG278Q at 120Hz (idle goes to 135MHz) and at 144Hz (810MHz). The same idle increase happens going from dual to triple monitors (60Hz on all of them), so there's not much I can do since I run a triple-monitor setup. Also, G-Sync is enabled in desktop mode, so I'd assume Nvidia could get some power savings by setting the idle clock lower and dropping the FPS to the monitor's G-Sync minimum while the desktop is doing nothing. In any case, good find by PCPer, which should lead to Nvidia fixing the issue pretty quickly.
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
There is more to this issue than just resolution and refresh rate. Modifying EDID values in CRU results in totally different behaviour between Nvidia and AMD. For example, my 7950 would never drop to idle clocks if I modified the front porch/back porch and sync width in one direction. That ended up allowing a maximum 74Hz refresh rate on AMD while keeping idle clocks on my LG IPS monitor, and a 76Hz maximum on Nvidia; my 760 and 960 apparently didn't show this behaviour.
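To make the porch/sync-width point concrete: the pixel clock the card has to drive is the blanking-included frame size times the refresh rate, so trimming the porches lets a higher refresh rate fit under the same pixel-clock limit. The sketch below uses made-up illustrative 1080p timings, not PPB's actual CRU values:

Code:
def pixel_clock_mhz(h_active, h_front, h_sync, h_back,
                    v_active, v_front, v_sync, v_back, refresh_hz):
    """Pixel clock = total horizontal pixels * total vertical lines * refresh rate."""
    h_total = h_active + h_front + h_sync + h_back
    v_total = v_active + v_front + v_sync + v_back
    return h_total * v_total * refresh_hz / 1e6

# Standard-ish 1080p blanking vs a reduced-blanking tweak, both at 75Hz.
standard = pixel_clock_mhz(1920, 88, 44, 148, 1080, 4, 5, 36, 75)   # ~186 MHz
reduced  = pixel_clock_mhz(1920, 48, 32, 80, 1080, 3, 5, 23, 75)    # ~173 MHz
print(f"standard blanking: {standard:.0f} MHz, reduced blanking: {reduced:.0f} MHz")

Which side of the driver's internal threshold those numbers land on is presumably what decides whether the card is willing to stay at idle clocks.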
 

moonbogg

Lifer
Jan 8, 2011
10,637
3,095
136
Am I the only one who missed something? I'm not seeing an increase beyond about 1 watt. Who could possibly care?
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
@moonbogg

On the first chart in the OP: the roughly 75% increase in idle system wattage (77W to 134W) when going from 120Hz to 144Hz.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
so if an Nvidia user has a 144 Hz monitor, then their GPU probably ends up drawing more power than an AMD card in the long run?

This is the important implication. Most people are probably at idle most of the time, so the power situation flips around hugely (load power draw is already similar). If nothing else, the perception of lower power draw on Nvidia should be re-evaluated.

That only applies to 144Hz 1440p monitors. You can run a 1080p monitor @ 144Hz and still get idle clocks. Of course, you have roughly 75% more pixels to push at 1440p, so that has to be the reason the clocks increase.

One user on another forum tried to lower the clocks with Inspector and got a green screen when running 1440p @ 144Hz, so something funky is going on.

Interesting if that is true. They should have done more testing for that article. On a weaker GPU, would it be the same situation at 1080p? etc.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
Interesting if that is true. They should have done more testing for that article. On a weaker GPU, would it be the same situation at 1080p? etc.

It seems to be a total desktop pixel rate threshold that determines whether the idle clock is 150MHz or 810MHz. I see the same increase going from 2x 1440p monitors (@ 60Hz, stays at 150MHz) to 3x 1440p (@ 60Hz, goes to 810MHz), or with 2x 1440p monitors @ 120Hz (810MHz). Two 1440p monitors (one at 120Hz, the other at 60Hz) sit at 150MHz idle, so I would think one 1440p at 144Hz would mean 150MHz as well. If it's not a bug, then it's another limitation to keep away the artifacts/issues seen when the idle clock is too low.
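A quick back-of-the-envelope version of that threshold idea, counting active pixels only and ignoring blanking, so the absolute numbers are only indicative (the idle clocks noted in the comments are the ones Eymar reports above):

Code:
# Total desktop pixel rate (megapixels/second) for the monitor mixes described above.
def mpix_per_s(width, height, hz, count=1):
    return width * height * hz * count / 1e6

configs = {
    "2x 1440p @ 60Hz  (reported 150MHz idle)": mpix_per_s(2560, 1440, 60, 2),    # ~442
    "3x 1440p @ 60Hz  (reported 810MHz idle)": mpix_per_s(2560, 1440, 60, 3),    # ~664
    "2x 1440p @ 120Hz (reported 810MHz idle)": mpix_per_s(2560, 1440, 120, 2),   # ~885
    "1440p @ 120Hz + 1440p @ 60Hz (reported 150MHz idle)":
        mpix_per_s(2560, 1440, 120) + mpix_per_s(2560, 1440, 60),                # ~664
    "1x 1440p @ 144Hz": mpix_per_s(2560, 1440, 144),                             # ~531
}
for name, rate in configs.items():
    print(f"{name}: {rate:.0f} Mpix/s")

Note that the 3x 60Hz and 120Hz+60Hz combinations come out to the same active-pixel rate yet reportedly idle differently, so blanking intervals or a per-head limit may also be in play; this is a sketch, not an explanation.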
 

amenx

Diamond Member
Dec 17, 2004
4,082
2,355
136
Sounds like something correctable with a driver update. Going from 120Hz (GPU at 135MHz) to 144Hz (885MHz) is just silly.
 

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
It seems to be a total desktop pixel rate threshold that determines whether the idle clock is 150MHz or 810MHz. I see the same increase going from 2x 1440p monitors (@ 60Hz, stays at 150MHz) to 3x 1440p (@ 60Hz, goes to 810MHz), or with 2x 1440p monitors @ 120Hz (810MHz). Two 1440p monitors (one at 120Hz, the other at 60Hz) sit at 150MHz idle, so I would think one 1440p at 144Hz would mean 150MHz as well. If it's not a bug, then it's another limitation to keep away the artifacts/issues seen when the idle clock is too low.


Yeah, it's probably a CYA measure to prevent artifacts. On AMD I'm able to run triple 1080p @ 144Hz with only a bump to my main card's memory clocks, and without artifacts.

 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
Thanks thesmokingman, it's good to know AMD doesn't have this problem, so maybe Nvidia can fix it on their side. Sucks that I've been wasting all this power since last September, when I bought the ROG Swifts.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Interesting that all these users who bought nVidia because it was more efficient and cooler and quieter, etc... never noticed the additional noise and heat.

I wonder if there will be any outrage across the web? For years now the press has let us down by not informing everyone.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
Interesting that all these users who bought nVidia because it was more efficient and cooler and quieter, etc... never noticed the additional noise and heat.

I wonder if there will be any outrage across the web? For years now the press has let us down by not informing everyone.

People who run 1440p @ 144Hz aren't the vast majority of users, so I'd assume most people using Nvidia are running at 150MHz at idle. I do like how AMD can run triple 1080p @ 144Hz at pretty low clock rates (I'm hoping the same holds true for AMD at 3x 1440p @ 144Hz).
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
People who run 1440p @ 144Hz aren't the vast majority of users, so I'd assume most people using Nvidia are running at 150MHz at idle. I do like how AMD can run triple 1080p @ 144Hz at pretty low clock rates (I'm hoping the same holds true for AMD at 3x 1440p @ 144Hz).

We've had the Swift around for years. None of the reviewers noticed/reported it? We've had people very adamant that they wouldn't buy AMD because of power draw. Here we have huge power usage at idle, where PCs spend the majority of their time! Surely they will be absolutely livid that they've been deceived. No way nVidia wasn't aware.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
We've had the Swift around for years. None of the reviewers noticed/reported it? We've had people very adamant that they wouldn't buy AMD because of power draw. Here we have huge power usage at idle, where PCs spend the majority of their time! Surely they will be absolutely livid that they've been deceived. No way nVidia wasn't aware.

That is a good point about people not buying AMD because of power draw, and I agree this is a pretty big issue which I hope gets fixed soon. I'll give reviewers the benefit of the doubt, as the 144Hz power usage issue may not have been there at the ROG Swift's release. I know I had issues with the ROG Swift and Chrome crashing the PC a few months back. The issue was with G-Sync and was fixed in drivers, so maybe the increased idle clock was the fix. If the drivers at the ROG Swift's release had this issue, then yeah, the reviewers (general hardware sites, not monitor-centric ones like TFT Central) should have noticed.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Is no one reading this graph right? Nvidia power consumption is crazy at higher refresh rates while the Strix doesn't even budge.

Am I reading this wrong, or are we all downplaying how much less power AMD is using compared to Nvidia?

Any power consumption savings for Nvidia while gaming are thrown out the window for me if this is the case (for high refresh rate gamers; this sadly doesn't apply to me since I can't get a high refresh rate panel at 4K).
 

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
Nice attempts by "certain" members to "brew" things up. This was very well known in the community; on this very forum you can find people (including me...) recommending 120Hz instead of 144Hz for the Windows desktop and running 144Hz in games, where it matters. A tiny inconvenience, but a 120Hz desktop is light years ahead of 60Hz...


So yeah, NV uses more power once you go over 120Hz, and no, no one sane runs the Windows desktop at 144Hz on them because of this problem. Hopefully it will get fixed, but it's not a deal breaker in any way.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
Keep in mind this seems to happen only at 1440p so far, for the 980 Ti anyway. If you are running 1080p at 144Hz it doesn't happen. I haven't seen anyone say what happens with 4K at 60Hz, or with weaker GPUs at 1080p 144Hz.

Searching Google, this has been popping up here and there since at least 2014.

http://www.overclock.net/t/1497172/...emperatures-and-power-draw-on-your-nvidia-gpu

https://www.reddit.com/r/nvidia/comments/38vonq/psa_nvidia_users_with_a_high_refresh_rate_monitor/

It seems earlier/weaker cards did it at 1080p, and even at 120Hz.
 