Just from what you posted, why does the Fury top out at 142Hz? It uses over 100% more power according to those graphs. Hmm.
I think it's more reasonable to look at that the other way. That is a huge power savings from exploiting lower clock speeds when using lower refresh rates.
I always attributed the increased power draw/high idle clocks when not gaming @ 144Hz to some sort of bug in Nvidia's drivers.
If you have a 144Hz panel and a Ti, leaving the panel at 144Hz when not gaming/watching video/using hardware acceleration will cause the core to idle at ~850MHz.
However, if the user sets their refresh rate to 120Hz while not gaming, the idle core clock falls way down to ~135MHz, undoubtedly saving power (a quick way to check this on your own card is sketched below).
Maybe it isn't a bug? Maybe it is? Not sure.
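Not gospel, but for anyone who wants to verify this on their own machine, here's a minimal Python sketch (assuming nvidia-smi is on the PATH) that just reads the current core clock; run it once with the desktop at 144Hz and once at 120Hz and compare. The refresh-rate switch itself is done in the NVIDIA control panel or xrandr as usual.

```python
import subprocess

def core_clock_mhz(gpu_index=0):
    """Current graphics core clock in MHz, as reported by nvidia-smi."""
    out = subprocess.check_output(
        [
            "nvidia-smi",
            "-i", str(gpu_index),
            "--query-gpu=clocks.current.graphics",
            "--format=csv,noheader,nounits",
        ],
        text=True,
    )
    return int(out.strip())

if __name__ == "__main__":
    # On the setups described above you'd expect roughly ~850MHz with a 144Hz
    # desktop vs ~135MHz at 120Hz, both measured while idle.
    print(f"Current core clock: {core_clock_mhz()} MHz")
```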
So if an Nvidia user has a 144Hz monitor, then their GPU probably ends up drawing more power than an AMD card in the long run?
That only applies to 1440p monitors at 144Hz. You can run a 1080p monitor @ 144Hz and still have idle clocks. Of course you have something like 75% more pixels to push at 1440p, so that has to be the reason the clocks increase.
One user on another forum tried to lower the clocks with Inspector and got a green screen when running 1440p @ 144Hz, so something funky is going on.
Interesting if that is true. They should have done more testing for that article. On a weaker GPU, would it be the same situation at 1080p? Etc.
Seems to be a total desktop pixel rate threshold that determines whether the idle clock is 150MHz or 810MHz (some rough numbers below). I see the same increase going from two 1440p monitors @ 60Hz (stays at 150MHz) to three 1440p @ 60Hz (goes to 810MHz), or to two 1440p monitors @ 120Hz (810MHz). Two 1440p monitors (one at 120Hz and the other at 60Hz) idle at 150MHz, so you would think a single 1440p at 144Hz would mean 150MHz as well. If it isn't a bug, then it's another limitation to keep away the artifacts/issues seen when the idle clock is too low.
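For what it's worth, here's a rough back-of-the-envelope Python sketch that just adds up width x height x refresh for the configurations mentioned above (1440p assumed to be 2560x1440), to see what a total-pixel-rate cutoff would have to explain:

```python
# Rough total desktop pixel rate (width * height * refresh, summed over
# monitors) for the configurations described above. 1440p assumed 2560x1440.
W, H = 2560, 1440

configs = {
    "2x 1440p @ 60Hz (150MHz idle)":       [60, 60],
    "3x 1440p @ 60Hz (810MHz idle)":       [60, 60, 60],
    "2x 1440p @ 120Hz (810MHz idle)":      [120, 120],
    "1440p @ 120Hz + 60Hz (150MHz idle)":  [120, 60],
    "1x 1440p @ 144Hz (case in question)": [144],
}

for name, refresh_rates in configs.items():
    total = sum(W * H * hz for hz in refresh_rates)
    print(f"{name:40s} {total / 1e6:8.1f} Mpixel/s")

# Note: the 3x 60Hz and the 120Hz + 60Hz setups come out to the same total
# (~663.6 Mpixel/s) yet reportedly idle at different clocks, so a simple sum
# of pixel rates can't be the only factor involved.
```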
Interesting that all these users who bought nVidia because it was more efficient and cooler and quieter, etc., never noticed the additional noise and heat.
I wonder if there will be any outrage across the webs. For years now the press has let us down by not informing everyone.
People who run 1440p @ 144Hz aren't the vast majority of users, so I'd assume most people using Nvidia are running at 150MHz at idle. I do like how AMD can run triple 1080p @ 144Hz at pretty low clock rates (I'm hoping the same holds true for AMD at 3x 1440p @ 144Hz).
We've had the Swift around for years. None of the reviewers noticed/reported it? We've had people very adamant that they wouldn't buy AMD because of power draw. Here we have huge power usage at idle, where PCs spend the majority of their time! Surely they will be absolutely livid they've been deceived. No way nVidia wasn't aware.