[PCPER] Testing GPU Power Draw at Increased Refresh Rates using the ASUS PG279Q


Eymar

Golden Member
Aug 30, 2001
Is no one else reading this graph the way I am? Nvidia's power consumption is crazy at higher refresh rates while the Strix doesn't even budge.

Am I reading this wrong, or are we all downplaying how much less power AMD is using compared to Nvidia?

If this is the case, any power-consumption savings for Nvidia while gaming are thrown out the window for me (for high-refresh-rate gamers, anyway; this sadly doesn't apply to me since I can't get a high-refresh-rate panel at 4K).

Right? I always treated running at higher idle clocks as normal, since I was running 3x 1440p with Titan Xs, but seeing AMD handle 3x 1080p @ 144Hz with low idle clocks, I'd really like to see whether the same holds true for AMD at 3x 1440p @ 144Hz.
 
Feb 19, 2009
It's similar to the multi-monitor situation; IIRC, Kepler couldn't drop to idle clocks unless it was driving a single monitor.

These setups tend to be niche: 144Hz is pretty niche, as is multi-monitor.

It's definitely an issue if you're running such a setup.

Edit: This is comedy gold though:

"I would bet that most gamers willing to buy high end display hardware capable of those speeds won’t be overly concerned with 50-60 watts of additional power draw, but it’s an interesting data point for us to track going forward and to compare AMD and NVIDIA hardware in the future."

Yet in their own reviews in the past, they didn't hesitate to point out when AMD GPUs used SINGLE-DIGIT watts more at idle... but now 50-60W extra is merely "interesting". lol
 

bystander36

Diamond Member
Apr 1, 2013
At 120Hz both seem to draw about the same, but at 144Hz/1440p the Nvidia card jumps up a lot.

They failed to test the memory clocks on the AMD card when running 144Hz. At least in the past, it wasn't AMD's core clocks that increased but its memory clocks. It seems Nvidia raises the core clock instead.
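If anyone wants to check this on their own card, here's a minimal sketch using the pynvml bindings for NVIDIA's NVML library (assumed to be installed; `nvidia-smi --query-gpu=clocks.gr,clocks.mem,power.draw --format=csv -l 1` reports the same data from the command line). Leave it running at an idle desktop and switch refresh rates to see which clock stays elevated:

```python
# Minimal idle-clock logger for an NVIDIA card via NVML (pynvml package).
# Run it at the desktop, switch between 60/120/144 Hz, and watch whether
# the core clock, memory clock, or board power stays elevated.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        core_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        mem_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # reported in milliwatts
        print(f"core {core_mhz:4d} MHz | mem {mem_mhz:4d} MHz | {power_w:5.1f} W")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

AMD cards need different tooling (the screenshots later in the thread come from a Windows monitoring utility), so this only covers the NVIDIA side.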
 

thesmokingman

Platinum Member
May 6, 2010
bystander36 said:
At 120Hz both seem to draw about the same, but at 144Hz/1440p the Nvidia card jumps up a lot.

They failed to test the memory clocks on the AMD card when running 144Hz. At least in the past, it wasn't AMD's core clocks that increased but its memory clocks. It seems Nvidia raises the core clock instead.


I fail to see how that matters in the grand scheme of things when the Fury is a FLAT LINE across the power-consumption chart. In case you missed it: whether or not the memory clocks are bumped up, consumption hardly changed. Or zomg, a whopping 0.6 W difference? Hold the presses, we need a re-test!


[Image: powerdrawamd1.png]
 

bystander36

Diamond Member
Apr 1, 2013
thesmokingman said:
I fail to see how that matters in the grand scheme of things when the Fury is a FLAT LINE across the power-consumption chart. In case you missed it: whether or not the memory clocks are bumped up, consumption hardly changed. Or zomg, a whopping 0.6 W difference? Hold the presses, we need a re-test!

[Image: powerdrawamd1.png]

Like zomg, I was just curious. Maybe the Fury X doesn't need to clock up its memory like in the past. Chill out.
 

tential

Diamond Member
May 13, 2008
Keep in mind this seems to happen only at 1440p so far, for the 980 Ti anyway. If you're running 1080p at 144Hz it doesn't happen. I haven't seen anyone say what happens at 4K 60Hz, or with weaker GPUs at 1080p 144Hz.

Searching Google, this has been popping up here and there since at least 2014.

http://www.overclock.net/t/1497172/...emperatures-and-power-draw-on-your-nvidia-gpu

https://www.reddit.com/r/nvidia/comments/38vonq/psa_nvidia_users_with_a_high_refresh_rate_monitor/

It seems earlier/weaker cards did it at 1080p and even at 120Hz.

It matters the most for the 980 Ti at 1440p... the 980 Ti is the card most LIKELY to be used at high refresh rates at 1440p.


Right? I always treated running at higher idle clocks as normal, since I was running 3x 1440p with Titan Xs, but seeing AMD handle 3x 1080p @ 144Hz with low idle clocks, I'd really like to see whether the same holds true for AMD at 3x 1440p @ 144Hz.

I thought more people would care, especially considering how much people talk about high refresh rates and 1440p on here...
I guess power consumption isn't such a big deal after all.
Honestly, this should have been noticed earlier by major reviewers. Fans spinning up at the desktop should be NOTICED. It's actually annoying when I think about it. We have 1440p monitors tested all the time; NONE of these people noticed the 980 Ti's fans turning on when the card should be idle at the desktop? It just makes me wonder what other flaws of the 980 Ti they overlook.
I'd still get one though, the card is fast. Too bad Nvidia doesn't love me enough to put G-Sync in a big panel the way AMD has with FreeSync by making it easily accessible. My 4K 65-inch FreeSync panel dreams may be crashing down anyway, and I may be stuck with some small 55-inch panel.
 
Feb 19, 2009
From the PCPER write up:

The obvious question here though is why NVIDIA would need to go all the way up to 885MHz in order to support the jump from 120Hz to 144Hz refresh rates. It seems quite extreme and the increased power draw is significant, causing the fans on the EVGA GTX 980 Ti to spin up even while sitting idle at the Windows desktop.

NVIDIA is aware of the complication, though it appears that a fix won’t really be in order until an architectural shift is made down the road.

With the ability to redesign the clock domains available to them, NVIDIA could design the pixel and GPU clock to be completely asynchronous, increasing one without affecting the other. It’s not a simple process though, especially in a processor this complex. We have seen Intel and AMD correctly and effectively separate clocks in recent years on newer CPU designs.

This is true for the core clock, but on AMD the VRAM clocks don't fully idle in multi-monitor situations. The core does idle just fine in my experience, including at 4K 60Hz.
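On the article's point about the 885MHz jump: the link pixel clock the display engine has to feed scales directly with refresh rate. Here's a rough sketch of that arithmetic, assuming reduced-blanking overheads of my own choosing (the real timings come from the monitor's EDID, so treat the absolute numbers as approximate):

```python
# Approximate pixel clock needed to scan out 2560x1440 at several refresh
# rates. The blanking overheads below are assumed, roughly CVT-reduced-blanking
# style values; the monitor's EDID defines the real timings.
H_ACTIVE, V_ACTIVE = 2560, 1440
H_BLANK, V_BLANK = 160, 41  # assumed horizontal/vertical blanking

for refresh_hz in (60, 120, 144):
    pixel_clock_mhz = (H_ACTIVE + H_BLANK) * (V_ACTIVE + V_BLANK) * refresh_hz / 1e6
    print(f"{refresh_hz:3d} Hz -> ~{pixel_clock_mhz:.0f} MHz pixel clock")
```

The step from 120Hz to 144Hz is only another ~100MHz of pixel clock, but per the quoted explanation Maxwell's display clock isn't decoupled from the rest of the chip, so servicing it apparently drags the core clock (and power) up with it.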
 

thesmokingman

Platinum Member
May 6, 2010
This is true for the core clock, but on AMD the VRAM clocks don't fully idle in multi-monitor situations. The core does idle just fine in my experience, including at 4K 60Hz.

Hmm, they idle fine for me at all refresh rates except 144Hz, at which point the memory goes to 3D clocks. And as shown in the PCPer article, it made a 0.6 W difference in consumption. The control over the memory-clock switch is somewhere in the PowerPlay tables, I would guess. No matter though, it doesn't make any difference consumption-wise.

Btw, in the pic below it was running at 144Hz, then I switched to 120Hz. You can see the drop.

[Image: HXU68CL.jpg]
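For reference, on Linux with the amdgpu driver (an assumption on my part; older Catalyst/fglrx setups expose this differently) you can read the memory DPM state straight from sysfs instead of a GUI tool:

```python
# Print the available AMD memory clock (MCLK) DPM states; the driver marks
# the active state with an asterisk. Assumes card0 is the GPU and the
# amdgpu kernel driver is in use.
from pathlib import Path

states = Path("/sys/class/drm/card0/device/pp_dpm_mclk").read_text()
for line in states.splitlines():
    active = "  <- current" if line.rstrip().endswith("*") else ""
    print(line.rstrip() + active)
```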
 

Vesku

Diamond Member
Aug 25, 2005
From the PCPER write up:

NVIDIA is aware of the complication, though it appears that a fix won’t really be in order until an architectural shift is made down the road.

Doesn't sound like a driver update will do the job, just like Fermi (mine, on dual monitors) and apparently Kepler (didn't own one) had unresolvable idle quirks.

Given how important power efficiency is to many, on forums and for many GPU reviewers at least, I expect this will upset a lot of Maxwell 2 owners who have these high-refresh monitors.

I also hope they check Pascal for this issue when it's reviewed, as the Nvidia response conveyed by PCPer doesn't really pinpoint which architecture down the road will have the fix.
 

bystander36

Diamond Member
Apr 1, 2013
The irony: save 30 W here in gaming and lose 60 W every idle moment, 24/7.

I see about a 60 W difference at 144Hz, and a 4 W difference at 120Hz and below. I don't see the 30 W difference.

The obvious solution is to use 120Hz at the desktop and 144Hz while gaming.
 

Erenhardt

Diamond Member
Dec 1, 2012
bystander36 said:
I see about a 60 W difference at 144Hz, and a 4 W difference at 120Hz and below. I don't see the 30 W difference.

The obvious solution is to use 120Hz at the desktop and 144Hz while gaming.

The other is to go back to peasantry resolution...
Or get a proper graphics card from AMD that doesn't cut corners here and there.
 

Piroko

Senior member
Jan 10, 2013
Elevated idle clocks and RAM clocks at high resolutions, high refresh rates, and in multi-monitor setups have been a thing ever since those very low power states were introduced. Just search your favorite card from the recent past and add "multi monitor idle" to the search if you don't believe me. The more pixels you push through your card, the more bandwidth you need, in both 2D and 3D modes.
As for Maxwell vs. GCN 1.2, that's not much of a surprise to me since Fiji has more bandwidth at any power state than current Maxwell. I assume this problem will be gone for both sides with HBM2.
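To put a very rough number on the "more pixels, more bandwidth" point, here's a back-of-the-envelope sketch. It assumes an uncompressed 32-bit framebuffer and counts only the scanout reads, so desktop composition, video overlays and framebuffer compression will move the real figures around:

```python
# Back-of-the-envelope memory traffic just to scan the framebuffer out to
# the display(s). Assumes 4 bytes per pixel and no framebuffer compression.
BYTES_PER_PIXEL = 4

setups = [
    ("1920x1080 @ 60 Hz",     1920,     1080, 60),
    ("2560x1440 @ 144 Hz",    2560,     1440, 144),
    ("3x 2560x1440 @ 144 Hz", 3 * 2560, 1440, 144),
]

for name, width, height, refresh_hz in setups:
    gbytes_per_s = width * height * refresh_hz * BYTES_PER_PIXEL / 1e9
    print(f"{name:22s} ~{gbytes_per_s:4.1f} GB/s of scanout reads")
```

That's small next to a high-end card's peak bandwidth, but it has to be sustained continuously even at an idle desktop, which is presumably why the lowest memory power states can't always cover the bigger configurations.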
 

JoeRambo

Golden Member
Jun 13, 2013
bystander36 said:
The obvious solution is to use 120Hz at the desktop and 144Hz while gaming.

1440p 144Hz and the high-end hardware to drive it in games is enthusiast territory; I'm pretty certain everyone in that group either knows about this problem and runs a 120Hz desktop, or doesn't care :)

Erenhardt said:
The other is to go back to peasantry resolution...
Or get a proper graphics card from AMD that doesn't cut corners here and there.

I see what you did there. The problem is that those "proper" cards currently don't go over 144Hz, and it remains to be seen what happens to power usage when/if AMD hacks in 165Hz support. But let's ignore the obvious solution of driving both vendors' cards at 120Hz, which has other benefits like dividing evenly into 24, 30 and 60Hz content, and instead focus on shouting about the sky falling in these forums.
 

Sunaiac

Member
Dec 17, 2014
bystander36 said:
The obvious solution is to use 120Hz at the desktop and 144Hz while gaming.

Funny that the obvious solution of setting the R9 290X to -50% power, losing 5% performance and 100 W, was never mentioned in Maxwell tests. Nor was using non-reference 290Xs at reference clocks, or anything else that takes two clicks in the drivers.

Nothing is a deal breaker when there's Nvidia written on a graphics card, but God forbid AMD cards require users to make two more clicks in the drivers!

The Press, objectively (c) nVidia commercials since 1999.
 

guskline

Diamond Member
Apr 17, 2006
This forum thread is interesting and somewhat predictable: the Nvidia vs. AMD post and counter-post. I own AND like cards from both companies.

As something of an overclocker, I've learned that no matter which company's card, at a certain point the power usage surges when you push the overclock.

I'm not trying to minimize the findings here about the GTX 980 Ti coupled with this particular monitor.

Just an observation that whenever you get into overclocking, you should expect some jump in power usage.

What is interesting is that the Fury X uses the same idle power regardless of 60Hz, 120Hz, or even 144Hz, while the 980 Ti jumps significantly at idle at 144Hz.

That's a solid bullet point favoring the Fury X over the GTX 980 Ti when idling on a 144Hz monitor.
 

ShintaiDK

Lifer
Apr 22, 2012
It's similar to the multi-monitor situation; IIRC, Kepler couldn't drop to idle clocks unless it was driving a single monitor.

Multi-monitor power draw has been a widespread AMD issue.

[Image: power_multimon.gif]


Vesku said:
Doesn't sound like a driver update will do the job, just like Fermi (mine, on dual monitors) and apparently Kepler (didn't own one) had unresolvable idle quirks.

Sounds like it's simply a hardware bottleneck somewhere that requires the jump for 144Hz, just like previous AMD cards with multi-monitor.
 

NTMBK

Lifer
Nov 14, 2011
ShintaiDK said:
Multi-monitor power draw has been a widespread AMD issue.

[Image: power_multimon.gif]

I'm just glad they both handle it better than they used to. Until recently I had an old Fermi card in my workstation; that thing was a furnace with two monitors attached. The new Kepler card handles it much better.
 

guskline

Diamond Member
Apr 17, 2006
ShintaiDK, from your post above it appears that AMD's Fiji core has come a LONG way toward leveling the playing field on idle power usage in multi-monitor setups.

I really like my GTX 980 Ti. However, the Fiji core has really leveled the playing field at the high end. It will be VERY interesting to see how the release of Pascal changes that playing field.
 

ShintaiDK

Lifer
Apr 22, 2012
guskline said:
ShintaiDK, from your post above it appears that AMD's Fiji core has come a LONG way toward leveling the playing field on idle power usage in multi-monitor setups.

I really like my GTX 980 Ti. However, the Fiji core has really leveled the playing field at the high end. It will be VERY interesting to see how the release of Pascal changes that playing field.

I think it applies to Tonga as well, aka all of GCN 1.2. But any previous GCN is terrible with multi-monitor.
 

tential

Diamond Member
May 13, 2008
Vesku said:
From the PCPER write up:

Doesn't sound like a driver update will do the job, just like Fermi (mine, on dual monitors) and apparently Kepler (didn't own one) had unresolvable idle quirks.

Given how important power efficiency is to many, on forums and for many GPU reviewers at least, I expect this will upset a lot of Maxwell 2 owners who have these high-refresh monitors.

I also hope they check Pascal for this issue when it's reviewed, as the Nvidia response conveyed by PCPer doesn't really pinpoint which architecture down the road will have the fix.
You'd think it would upset them, but judging from the responses in this thread, the theme seems to be:
"High power consumption in high-refresh/multi-monitor setups is normal," despite the fact that AMD has cards that can handle higher refresh rates without massive power jumps.

It'll be interesting to see who comes out with the more power-efficient architecture next generation, and whether people still care about power efficiency if AMD comes out ahead.
 

sm625

Diamond Member
May 6, 2011
So I'm supposed to believe that a 980 Ti system at idle draws 76 watts at both 60Hz and 120Hz, but at 144Hz the power draw doubles? That is so broken.
 

ShintaiDK

Lifer
Apr 22, 2012
sm625 said:
So I'm supposed to believe that a 980 Ti system at idle draws 76 watts at both 60Hz and 120Hz, but at 144Hz the power draw doubles? That is so broken.

It's the entire system's consumption at the wall. So the card itself goes from something like 10 W to 50-60 W or so, once you account for PSU losses.
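A quick sanity check on that wall-versus-card arithmetic, assuming a PSU efficiency of around 85% at these light loads (my assumption; the exact figure depends on the unit and the load point):

```python
# Convert the at-the-wall power delta quoted in the thread into a rough
# card-level delta. PSU efficiency is assumed and varies with load, so this
# is only a ballpark check of the 50-60 W figure above.
PSU_EFFICIENCY = 0.85          # assumed efficiency at light load

wall_delta_w = 60.0            # ~60 W extra at the wall going 120 Hz -> 144 Hz (per the thread)
card_delta_w = wall_delta_w * PSU_EFFICIENCY  # power delivered downstream of the PSU

print(f"~{wall_delta_w:.0f} W at the wall is roughly {card_delta_w:.0f} W at the card")
```

Which lines up with the 50-60 W card-level estimate.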
 

railven

Diamond Member
Mar 25, 2010
Glad I don't own a 144Hz panel anymore. I've been thinking of buying one again, but haven't found the panel for me (yet).

I didn't even know Nvidia had lower idle clocks than AMD; I thought they were the same. Puts more pennies in my piggy bank for my next upgrade!
 

tential

Diamond Member
May 13, 2008
railven said:
Glad I don't own a 144Hz panel anymore. I've been thinking of buying one again, but haven't found the panel for me (yet).

I didn't even know Nvidia had lower idle clocks than AMD; I thought they were the same. Puts more pennies in my piggy bank for my next upgrade!

I've wanted a high-refresh-rate monitor. The issue is there are no options for me. I know 1080p 120Hz / 4K 60Hz panels are possible, but I haven't seen a monitor above 50 inches with that, though I've seen HDTVs that have it. I guess I'll have to settle for 4K 60Hz-only gaming for a while, unless Korean manufacturers have new TVs in the works. It sucks that in a global market I can't easily figure out what options I have from overseas, now and upcoming.