[PCPER] Testing GPU Power Draw at Increased Refresh Rates using the ASUS PG279Q

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
It would be interesting if they could test a 120Hz 4K screen and see whether any of the cards can reach low idle clocks. I'm already sure Nvidia's couldn't.
 

yannigr

Member
Jun 29, 2014
28
3
0
For years now the press has let us down by not informing everyone.
I have to disagree here. Every time there is even a suspicion that something is wrong with AMD hardware, the press all over the planet comes forward with huge articles on their front pages, full of analysis, photos, tests, videos, interviews with people, technicians, actors, presidents, athletes, and a huge finger pointed at AMD.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I've wanted a high refresh rate monitor, but there are no options for me. I know 1080p 120Hz and 4K 60Hz panels are possible, but I haven't seen a monitor above 50 inches, though I've seen HDTVs that have it. I guess I'll have to settle for 4K 60Hz gaming for a while unless the Korean manufacturers have new TVs in the works. It sucks that in a global market I can't easily figure out what options I have from overseas, now and upcoming.

I have some weird nervous tic or something, where I see tearing and it absolutely jerks my attention.

144Hz was basically useless to me without V-sync, and using V-sync introduced a range of issues that made me question why I bought a 144Hz monitor in the first place.

So, I went and got a 1440p/60 IPS monitor instead. And now I'm looking more at ultrawide@60 than 1440p@120/144/165Hz. 60Hz seems to be just fine for me and my playing habits. I'm not huge into FPS games anymore.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
@moonbogg

On the first chart in the OP, note the ~75% increase in idle system wattage (77W to 134W) when going from 120Hz to 144Hz.

Oh, thanks lol. I totally missed that. So that extra power usage, I guess that's like leaving an extra light on in the house or something. My dad always told me to turn the lights off if not being used, so now when I look at my monitor I'm just going to be like, "Sorry dad" but only in my head :cool:
 

96Firebird

Diamond Member
Nov 8, 2010
5,738
334
126
Is it a refresh rate problem or a pixel-pushing problem? Like 1080p60 vs 1080p60x3?

They are still looking into it; the comments on PCPer's article give some insight, but the jury is still out.

Moonbogg, if you set your desktop to run at 120Hz instead of 144Hz in the Nvidia Control Panel, your cards should downclock as normal.
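For a rough sense of the pixel-pushing side, here's a back-of-the-envelope sketch in Python. The blanking values are illustrative CVT-RB-style numbers, not the PG279Q's actual EDID timings:

```python
# Approximate pixel clock = total pixels per frame * refresh rate.
# The blanking intervals below are illustrative (CVT-RB-style), not the
# PG279Q's real EDID values.

def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=62):
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

for hz in (60, 120, 144):
    print(f"2560x1440 @ {hz}Hz -> ~{pixel_clock_mhz(2560, 1440, hz):.0f} MHz")
# ~245 MHz @ 60Hz, ~490 MHz @ 120Hz, ~588 MHz @ 144Hz
```

By this estimate the jump from 120Hz to 144Hz is only another ~100 MHz of pixel clock, which makes the sudden wall between the two settings look more like a threshold than a bandwidth limit.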
 

4K_shmoorK

Senior member
Jul 1, 2015
464
43
91
There is, I think. 120Hz 4K is getting more common with TVs, for example.

Nope. HDMI 2.0 can't do it, DP 1.2 can't do it. The TV's internal refresh rate, maybe.

Now that DP 1.3 is out, it'll probably be a year until we see 4K (3840x2160) 120Hz+ panels.

More interesting are 34" 3440x1440 21:9 144Hz panels with DP 1.3. More useful, IMO.
 

Piroko

Senior member
Jan 10, 2013
905
79
91
There is, I think. 120Hz 4K is getting more common with TVs, for example.
All of them are 60Hz panels with interpolation, AFAIK.


On a side note, I've never understood the "TV as PC monitor" hype. You get panels with mediocre to downright terrible color reproduction, you're locked out of G-Sync/FreeSync, and the screens are generally way too bright for non-gaming use like browsing. It really doesn't seem like a good investment, despite the size.
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
It's a driver problem, just like the reduced color gamut problem. The issue here is how tame users are to let this pass for such a long period.

The driver detects EDID values from your monitor and probably has an internal lookup table where X pixel clock, or Y front porch/back porch/sync width, maps to a certain P-state on the GPU.

Certainly, having a 120Hz desktop and 144Hz gaming is a serious issue if you alt-tab often in the games where 144Hz matters (FPS players tend to do that when they die and wait to respawn). So it's a no-go. The only solution will be to push Nvidia to fix the issue, as AMD did with their own problem reading EDID values and determining P-states in the past. Workarounds =/= fixes.
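A minimal sketch of the kind of table being described here, in Python. The thresholds and P-state names are invented for illustration; the real driver logic and values are not public:

```python
# Hypothetical driver-side lookup: detected pixel clock -> GPU P-state.
# All ceilings and P-state assignments below are made up for illustration.

PSTATE_TABLE = [
    (250.0, "P8"),          # e.g. 1440p60 -> full idle clocks
    (500.0, "P5"),          # e.g. 1440p120 -> slightly raised clocks
    (float("inf"), "P0"),   # e.g. 1440p144 -> clocks pinned near 3D levels
]

def pstate_for(pixel_clock_mhz: float) -> str:
    """Return the first P-state whose pixel-clock ceiling fits."""
    for ceiling, pstate in PSTATE_TABLE:
        if pixel_clock_mhz <= ceiling:
            return pstate
    return "P0"

print(pstate_for(245))  # P8: 1440p60 idles normally
print(pstate_for(588))  # P0: 1440p144 keeps clocks high
```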
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Funny that the obvious solution of putting the R9 290X at -50% power, losing 5% perf and 100W, was never mentioned in Maxwell tests. Or using non-reference 290Xs at reference clocks. Or anything else that takes two clicks in the drivers.

Nothing is a deal breaker when there's Nvidia written on a graphics card, but God forbid AMD cards make users do two more clicks in the drivers!

The press, objectively (c) Nvidia commercials since 1999.

What is wrong with you, and those responding this way? Don't you look for solutions to problems when they come up? It's not like you don't have to make compromises with AMD cards as well. I've owned quite a lot of them over the years; I'm quite aware of their pitfalls and strengths.

I didn't say this was a non-issue, only that the obvious solution is to use 120Hz at the desktop. Once in game it doesn't matter, as you want the high clock rate. But it is a relatively minor issue to me, especially since I don't have a 144Hz monitor (I use a 120Hz one). I also don't have a 980 Ti.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
It's a driver problem, just like the reduced color gamut problem. The issue here is how tame users are to let this pass for such a long period.

The driver detects EDID values from your monitor and probably has an internal lookup table where X pixel clock, or Y front porch/back porch/sync width, maps to a certain P-state on the GPU.

Certainly, having a 120Hz desktop and 144Hz gaming is a serious issue if you alt-tab often in the games where 144Hz matters (FPS players tend to do that when they die and wait to respawn). So it's a no-go. The only solution will be to push Nvidia to fix the issue, as AMD did with their own problem reading EDID values and determining P-states in the past. Workarounds =/= fixes.

I don't think it's just a driver issue. Same with pre-GCN 1.2 and multi-monitor. It seems more like a hardware limitation. But I guess we'll see in the near future.

Someone with a 980 Ti and a 1440p 144Hz monitor could try to forcefully downclock and see if any problems arise.
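For anyone wanting to try that, a simple way to at least watch what the card does while toggling refresh rates is polling nvidia-smi from Python. The query flags below are real nvidia-smi options; actually forcing clocks on a consumer Maxwell card is tool-dependent and left aside here:

```python
import subprocess
import time

# Poll core clock, memory clock, and power draw once per second while
# switching the desktop between 120Hz and 144Hz in the control panel.
QUERY = "clocks.gr,clocks.mem,power.draw"

while True:
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        text=True,
    ).strip()
    print(out)  # e.g. "135 MHz, 324 MHz, 18.50 W" at proper idle
    time.sleep(1)
```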
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
AMD's problem with multi-monitor in the past might have been related to driving the displays over different connectors. You'd have to use HDMI, DP, and DVI together. Now we have cards with multiple DP outputs, so maybe that's why it's better for both AMD and Nvidia.

Maybe...

Since the increased power consumption seems to come from the memory clocks going up on older AMD cards, has anyone forced a lower clock speed and seen adverse effects?
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
AMD's problem with multi-monitor in the past might have been related to driving the displays over different connectors. You'd have to use HDMI, DP, and DVI together. Now we have cards with multiple DP outputs, so maybe that's why it's better for both AMD and Nvidia.

Maybe...

Since the increased power consumption seems to come from the memory clocks going up on older AMD cards, has anyone forced a lower clock speed and seen adverse effects?

In the past, the adverse effect was flickering, and it wasn't due to mixed port types. My 6950s used to flicker badly with two monitors hooked up over DVI if I forced the memory clocks to idle. I mostly didn't care.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Since the increased power consumption seems to come from the memory clocks going up on older AMD cards, has anyone forced a lower clock speed and seen adverse effects?

It gave flickering and artifacts.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
It's a driver problem, just like the reduced color gamut problem. The issue here is how tame users are to let this pass for such a long period.

The driver detects EDID values from your monitor and probably has an internal lookup table where X pixel clock, or Y front porch/back porch/sync width, maps to a certain P-state on the GPU.

Given Nvidia's response to PCPer, it doesn't seem like it's solely a driver issue.

"NVIDIA is aware of the complication, it appears that a fix won’t really be in order until an architectural shift is made down the road." - from article
 

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
AMD's problem with multi-monitor in the past might have been related to driving the displays over different connectors. You'd have to use HDMI, DP, and DVI together. Now we have cards with multiple DP outputs, so maybe that's why it's better for both AMD and Nvidia.

Maybe...

Since the increased power consumption seems to come from the memory clocks going up on older AMD cards, has anyone forced a lower clock speed and seen adverse effects?

It gave flickering and artifacts.


^^BS. On Tahiti it was the same as with Hawaii. Both of those generations had ZeroCore, just like Fury. I've been using multi-monitor going back to Cayman, btw. But back to Azix's question: you can't force lowered memory clocks at 144Hz. If someone's figured out a way to do it, I don't know about it. Regardless, even with raised memory clocks it made no difference to idle consumption. That said, at 120Hz the memory clocks can drop to 150MHz without artifacts. Artifacts were never the question, obviously.

It was a guess by some as to why Nvidia forced high core clocks. And as PCPer noted later in the article, it is a limitation of their design, not necessarily a measure to prevent artifacts.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
^^BS. On Tahiti it was the same as with Hawaii. Both of those generations had ZeroCore, just like Fury. I've been using multi-monitor going back to Cayman, btw. But back to Azix's question: you can't force lowered memory clocks at 144Hz. If someone's figured out a way to do it, I don't know about it. Regardless, even with raised memory clocks it made no difference to idle consumption. That said, at 120Hz the memory clocks can drop to 150MHz without artifacts. Artifacts were never the question, obviously.

It was a guess by some as to why Nvidia forced high core clocks. And as PCPer noted later in the article, it is a limitation of their design, not necessarily a measure to prevent artifacts.

Then you should tell sites they are wrong. Example:
https://www.techpowerup.com/reviews/AMD/R9_Nano/28.html

The 390X is exceptionally bad, but that's no surprise given its higher clocks:
https://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/28.html

13W single monitor, 89W multimonitor.
 

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
Then you should tell sites they are wrong. Example:
https://www.techpowerup.com/reviews/AMD/R9_Nano/28.html

The 390X is exceptionally bad, but that's no surprise given its higher clocks:
https://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/28.html

13W single monitor, 89W multimonitor.


Did you read the test parameters? It's a specific setup that I don't use or care about. Also, it's doubtful their setup reflects how a real user would configure a multi-monitor/Eyefinity rig. A few clicks in Afterburner and the clocks will drop, as I've shown repeatedly in this thread, without artifacts. Remember that?

Multi-monitor: Two monitors connected to the tested card, both using different display timings. Windows 7 Aero sitting at the desktop (1920x1080+1280x1024) with all windows closed and drivers installed. Card left to warm up in idle mode until power draw was stable. When using two identical monitors with same timings and resolution, power consumption will be lower. Our test represents the usage model of many productivity users, who have one big screen and a small monitor on the side.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Given Nvidia's response to PCPer, it doesn't seem like it's solely a driver issue.

"NVIDIA is aware of the complication, it appears that a fix won’t really be in order until an architectural shift is made down the road." - from article

It's good to see that NV is proactive about this and will address it in Pascal. As far as the existing Maxwell architecture goes, what level of GPU and CPU is needed to drive modern games at 1440p @ 144Hz? I am going to say 980 SLI or even 980 Ti SLI; a single 980 Ti can't hit those framerates anyway. If someone is going to be using $900-1300 worth of cards, what's the fuss about 100-130W of extra power usage?

What's interesting is that all these users who bought Nvidia because it was more efficient, cooler, quieter, etc. never noticed the additional noise and heat.

Kinda like how the reference 980 Ti was never considered a jet engine by the North American media while exceeding 50 dBA at load, but when the 6970/7970/290X did that, they were labeled "jet engines."

Same thing with full RGB over HDMI: nobody noticed that for a decade. That's the irony here. On this very forum people recommend(ed) the 960 over the 290 to save power while giving up 60-70% of the performance, yet my guess is zero people will sell their Maxwell cards for Furys over 100-130W of extra total system idle power in this use case. But next generation, if Pascal smashes Arctic Islands in perf/watt, power usage will once again be the most talked-about metric. Meh, I personally never cared about power usage and won't for a long time. I still remember the media touting the lower power usage of LCDs/LEDs as a major selling point against plasma TVs, while mostly ignoring the key image quality metrics where plasma wiped the floor with LCD/LED.

Anyway, it's good to see that NV is aware and a fix is coming with Pascal. At least the issue won't be ignored for long.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
It's not about driving games at that refresh rate, though. It's the "idle" state. 1440p is a target for many people with cards at or above the 290, even some with 380-level cards. The question is the cost of a 144Hz version of that and whether it's common. This also likely happens if you have a 1080p 144Hz monitor plus a second monitor.
 

Zodiark1593

Platinum Member
Oct 21, 2012
2,230
4
81
If one were to force downclocking, would issues arise?

Perhaps some sort of fixed-function hardware is used for display output, and 144Hz @ 1440p is beyond that spec, requiring the shaders to be used instead?
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
It's not about driving games at that refresh rate, though. It's the "idle" state. 1440p is a target for many people with cards at or above the 290, even some with 380-level cards. The question is the cost of a 144Hz version of that and whether it's common. This also likely happens if you have a 1080p 144Hz monitor plus a second monitor.

What he is saying is that unless you are using multiple 980 Tis, you would likely be just as happy at 120Hz on the desktop, or even in games. Who needs 144Hz on the desktop anyway?

It's a problem, but it isn't a big one. Most people don't have this setup, and if they do, I doubt most of them care that much.