[PCPER] Testing GPU Power Draw at Increased Refresh Rates using the ASUS PG279Q

Page 6 - AnandTech Forums

Azix

Golden Member
Apr 18, 2014
1,438
67
91
That seems to be another issue. The 144Hz 1440p power requirements are higher, but the 165Hz requirements are higher still. Since they are talking about G-Sync and very high refresh rates, and this happens on monitors without it, one should assume consumption will drop to 144Hz non-G-Sync levels.
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
Glad I don't own a 144Hz panel anymore. I've been thinking of buying again, but haven't found the panel for me (yet).

You know, you can run 144Hz at 120Hz and not notice any difference in motion fluidity. 144Hz is really not far from a gimmick.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
You know, you can run 144Hz at 120Hz and not notice any difference in motion fluidity. 144Hz is really not far from a gimmick.

IMO keep pushing the tech and the quality. Hopefully it spills into the mainstream market, and HDTV panels become native 200Hz.
Then maybe broadcasts go to 100Hz.
Imagine seeing a 4K 100Hz NFL Super Bowl?
Wouldn't be bad, right?

Doesn't hurt for manufacturers to keep competing against each other on refresh rates. They can't easily do it with resolution, but refresh rate works well. I'm interested to see where it will lead. The fewer barriers to our experience, the better....
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
You know, you can run 144Hz at 120Hz and not notice any difference in motion fluidity. 144Hz is really not far from a gimmick.

I'd call it a marginal bonus, not a gimmick. A little smoother; a little faster at drawing the frame.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
I love how it was an nvidia "shill site" that exposed this issue.

Oh what fun
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
I love how it was an nvidia "shill site" that exposed this issue.

Oh what fun

This issue has been known for over 2 years and acknowledged by NVidia board partners. There was already an acceptable workaround as well. This site exposed nothing.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
First I heard of it.

When the graphs got posted, it was completely new to me.

Looking at the article and the forum threads that were created referencing it, it is pretty clear that a lot of people knew nothing about it.

Nice try though
 

amenx

Diamond Member
Dec 17, 2004
4,405
2,725
136
This issue has been known for over 2 years and acknowledged by NVidia board partners. There was already an acceptable workaround as well. This site exposed nothing.
Yep. PcPer is clueless. Long-standing issue that affects all 144Hz+ 1440p monitors, not just the PG279Q. PcPer seems to have found out about it only now and thinks they stumbled upon a juicy new story.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
This issue has been known for over 2 years and acknowledged by NVidia board partners. There was already an acceptable workaround as well. This site exposed nothing.

But they haven't bothered fixing it until now, so there is some value to the article. I also didn't know about it. I knew the cards drew much more power with multiple monitors, but I didn't know about 144Hz 1440p.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
This issue has been known for over 2 years and acknowledged by NVidia board partners. There was already an acceptable workaround as well. This site exposed nothing.
Funny how I'm here every day and never heard a word. Read reviews, etc., and never heard a word. Never saw a single thread. By "well known" you mean by nVidia and its board partners?

First I heard of it.

When the graphs got posted, it was completely new to me.

Looking at the article and the forum threads that were created referencing it, it is pretty clear that a lot of people knew nothing about it.

Nice try though

You tend to "prefer" nVidia, but good to see the honesty on your part.

Yep. PcPer is clueless. Long-standing issue that affects all 144Hz+ 1440p monitors, not just the PG279Q. PcPer seems to have found out about it only now and thinks they stumbled upon a juicy new story.

Nothing to see here. Move along. Nobody is denying it's long-standing, just that it's been covered up. Nothing you or Pariah have said changes that.


If you guys have known about it for two years, why stay silent? Why didn't someone post here so other nVidia owners could use the "workaround"?

Why no reports or graphs from any of our wonderful review sites? Incompetence? Cover up?
 

amenx

Diamond Member
Dec 17, 2004
4,405
2,725
136
If you guys have known about it for two years, why stay silent? Why didn't someone post here so other nVidia owners could use the "workaround"?

Why no reports or graphs from any of our wonderful review sites? Incompetence? Cover up?
I don't have a 144Hz+ 1440p monitor (rather a 120Hz 1440p). I'd guess the vast majority of Nvidia owners don't either, and I doubt those that do even break 1%. So the vast majority of Nv owners do not have this problem. And those that do can, with a little digging around, find a very easy workaround. I think it's laughable for anyone to be upset that it's not headline news at all the major tech sites and that therefore there must be some sort of 'coverup'. :D
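For anyone wanting to check whether their own card is affected before hunting for the workaround, the reported clocks at desktop idle tell the story. A minimal sketch, assuming `nvidia-smi` is available; the idle-clock thresholds are illustrative, not exact values for any particular card:

```python
import subprocess

# Illustrative thresholds (MHz): real idle clocks vary by card model.
IDLE_CORE_MAX = 500
IDLE_MEM_MAX = 1000

def parse_clocks(csv_line):
    """Parse one 'core, mem' line of nvidia-smi CSV output into ints."""
    core, mem = (int(v.strip()) for v in csv_line.split(","))
    return core, mem

def stuck_at_3d_clocks(csv_line):
    """True if the reported clocks look like 3D clocks rather than idle clocks."""
    core, mem = parse_clocks(csv_line)
    return core > IDLE_CORE_MAX or mem > IDLE_MEM_MAX

def query_clocks():
    """Ask the driver for current core/memory clocks (one CSV line per GPU)."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=clocks.gr,clocks.mem",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True)
    return out.stdout.strip().splitlines()

# Canned examples: a card idling properly vs. one holding 3D clocks on the desktop.
print(stuck_at_3d_clocks("135, 405"))    # proper 2D idle clocks
print(stuck_at_3d_clocks("1240, 3505"))  # elevated clocks at an idle desktop
```

If the second case is what an idle desktop on a 144Hz+ panel shows, dropping the desktop refresh rate to 120Hz (the workaround discussed in this thread) should bring the clocks back down.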
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I love how it was an nvidia "shill site" that exposed this issue.

Oh what fun

I don't think any site is a shill site. I think some sites, though, do have a bias at the time of writing, or don't write objective pieces as well as other sites.
 

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
I don't have a 144Hz+ 1440p monitor (rather a 120Hz 1440p). I'd guess the vast majority of Nvidia owners don't either, and I doubt those that do even break 1%. So the vast majority of Nv owners do not have this problem. And those that do can, with a little digging around, find a very easy workaround. I think it's laughable for anyone to be upset that it's not headline news at all the major tech sites and that therefore there must be some sort of 'coverup'. :D


There is no workaround AFAIK if you have multiple monitors, just saying.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I don't have a 144Hz+ 1440p monitor (rather a 120Hz 1440p). I'd guess the vast majority of Nvidia owners don't either, and I doubt those that do even break 1%. So the vast majority of Nv owners do not have this problem. And those that do can, with a little digging around, find a very easy workaround. I think it's laughable for anyone to be upset that it's not headline news at all the major tech sites and that therefore there must be some sort of 'coverup'. :D

Then it's incompetence? They all test 144Hz gaming monitors, yet none of them thought they should report that the card was running 3D clocks on desktop and let people know how to fix it? Thus avoiding not only the additional electrical usage, but also the noise and heat. That seems acceptable to you? It doesn't to me. Because I've seen far more trivial things beaten to death by review sites.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I don't think any site is a shill site. I think some sites, though, do have a bias at the time of writing, or don't write objective pieces as well as other sites.

You are much more trusting than I am, then. I think they take care of the IHVs and AIBs they get the most incentives from. Some companies send out a lot more samples and put on more free events that include travel, etc... Don't want to get cut off from that.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Then it's incompetence? They all test 144Hz gaming monitors, yet none of them thought they should report that the card was running 3D clocks on desktop and let people know how to fix it? Thus avoiding not only the additional electrical usage, but also the noise and heat. That seems acceptable to you? It doesn't to me. Because I've seen far more trivial things beaten to death by review sites.

This is far more likely an issue of not testing something they didn't know needed to be tested. Did you know this needed to be tested? Did it ever cross your mind? I never see them testing 3 monitor power usage either.

PCPER stumbled onto it and wrote an article. Now people are aware, and it may be tested in the future as a result. If you call this incompetence, then every site is incompetent.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
This is far more likely an issue of not testing something they didn't know needed to be tested. Did you know this needed to be tested? Did it ever cross your mind? I never see them testing 3 monitor power usage either.

PCPER stumbled onto it and wrote an article. Now people are aware, and it may be tested in the future as a result. If you call this incompetence, then every site is incompetent.

Well, I have no problem telling the difference in the fan speed on my card when it's running at 3D clocks. None at all... Ever! It's in a case by the floor, ~1m from my head, not on an open bench at ear level like most reviewers'. Seems to me that there is no chance at all they didn't notice it.

I didn't call it incompetence. I listed either incompetence or cover up. You think it's, what... They couldn't tell? Please! There's no way not to notice the cooler ramping up for 3D clocks.
 

Osjur

Member
Sep 21, 2013
92
19
81
I think it's laughable for anyone to be upset that it's not headline news at all the major tech sites and that therefore there must be some sort of 'coverup'. :D

But that means AMD has also been covering this up for a long time now. My old 7970 GHz Edition was running at 500/1500 clocks at 144 and 120Hz, and the 290X was running at 300/1250, which still increased idle power consumption quite a bit. Older GPUs from both sides ran closer to full 3D clocks.

After changing to the Fury X, my idle power consumption dropped about 60-70W vs the 290X, because it can run my 144Hz screen and my 4960x1600p PLP Eyefinity setup at idle clocks. This is actually the first GPU ever to run at idle clocks on my system, because I have been using exotic monitor setups.

I'm more surprised that this is making the headlines now, because the problem has been there for God knows how long, and only three consumer cards atm can take just about any kind of monitor configuration without incurring higher idle power consumption.

I actually asked W1zzard (TPU) at some point why he doesn't test Eyefinity/Surround monitor power consumption or higher-Hz monitors, and also told him that it makes a big difference compared to his 1080 + 1024 "multimonitor" testing, but he never answered me back :/
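A 60-70W idle delta like the one described above is easy to put into money terms with back-of-the-envelope arithmetic; the wattage, hours, and electricity price below are illustrative assumptions, not measurements from this thread:

```python
def extra_cost_per_year(extra_watts, hours_per_day, price_per_kwh):
    """Yearly cost of a GPU that holds elevated clocks at desktop idle.

    extra_watts   -- additional draw vs. proper idle clocks
    hours_per_day -- time per day the machine sits at the desktop
    price_per_kwh -- local electricity price
    """
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# e.g. ~65W extra, 8 hours a day at the desktop, $0.12/kWh (all illustrative):
print(f"${extra_cost_per_year(65, 8, 0.12):.2f} per year")  # roughly $23/year
```

Small per day, but enough over a year (plus the extra heat and fan noise) that it is worth applying the workaround on affected setups.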
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
There is no work around afaik if you have multiple monitors, just saying.

AMD has the same problem with multiple monitors. No workaround. No sites are covering this either. Must be a cover-up...

Well, I have no problem telling the difference in the fan speed on my card when it's running at 3D clocks. None at all... Ever! It's in a case by the floor, ~1m from my head, not on an open bench at ear level like most reviewers'. Seems to me that there is no chance at all they didn't notice it.

I didn't call it incompetence. I listed either incompetence or cover up. You think it's, what... They couldn't tell? Please! There's no way not to notice the cooler ramping up for 3D clocks.

I already posted the actual results for my card, and anyone who says they can tell the difference from a 50 RPM increase in a closed case with no problem, ever, is a liar. Not every video card has crap ergonomics like whatever you own.

This thread now has about 150 responses and over 7,300 views, and only ONE poster besides me has indicated they have a system capable of demonstrating this behavior, and theirs reproduces it under different circumstances, on AMD hardware as well. Not one poster with such a system has indicated they care in the slightest.

In fact, the only people who seem to care about this are the resident AMD trolls here. Rather ironic considering this behavior also affects all but 3 AMD cards which none of you actually own.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Well, I have no problem telling the difference in the fan speed on my card when it's running at 3D clocks. None at all... Ever! It's in a case by the floor, ~1m from my head, not on an open bench at ear level like most reviewers'. Seems to me that there is no chance at all they didn't notice it.

I didn't call it incompetence. I listed either incompetence or cover up. You think it's, what... They couldn't tell? Please! There's no way not to notice the cooler ramping up for 3D clocks.

I think it was more of a case of not thinking about testing it. It probably didn't cross their mind. They have a list of objectives that they look for. While testing a card, they go through the check list until it is done. They very well might not have noticed, because unlike you, they have a host of things on their mind as they go through the process.
 

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
AMD has the same problem with multiple monitors. No work around. No sites are covering this either. Must be a cover up...


Really? Same problem? My setup only goes to 3D memory clocks at 144Hz.

Hmm, they idle fine for me at all refresh rates except 144Hz, at which point the memory goes to 3D clocks. And as shown in the PCPer article, it made a 0.6W difference in consumption. The control over the memory clock switch is somewhere in the PowerPlay tables, I would guess. No matter though, it doesn't make any difference consumption-wise.

Btw, in the pic below it was running at 144Hz, then I switched it to 120Hz. You can see the drop.

[Attached image: HXU68CL.jpg]