Maxwell Power Consumption from Tom's Hardware


Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
I'm going to post the exact power consumption the two Maxwell cards draw on each power connection.

You can't get it any more exact than this. Anyone who refuses to believe this is a troll. Period.

/thread

GTX 970
wAljBiY.png



GTX 980
piBAjfP.png
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
It uses 240W for sure. The Nvidia driver detects Furmark and will indeed cap the power usage; it looks like THG got around this trick. If they had made such a gross measurement mistake, it would also show in their other consumption graphs.

Besides, did you pay attention to the throughput/watt in litecoin mining?

1.74 KH/s/W for the 970, 1.925 for the 980, and 2 for the 290X. What happened to the 2x perf/watt improvement? Or was it relative to the GTX 680, which scores about 1 KH/s/W?

And what would happen in a task where its throughput is maxed out, at 290X levels?

Edit: in the last graph the 980 consumes less because it has reached its max temp; the GPU is surely throttling:

pic_disp.php


That was really using a bench out of context...

Litecoin is the same comparison against the 680. Nvidia never compared the 2x perf/W claim to any AMD product, so I'm not sure why you are bringing that into the conversation. And yes, in regards to litecoin, the 980 just about doubles perf/W over the 680.

Sure it's throttling, but that is how the card is designed. Any heavy compute workload is going to drop clocks to stay within TDP; the 680 also drops clocks. Possibly Tom's card isn't dropping clocks on compute tasks.

The point is that Tom's is way off from everyone else. The 285 W load they measure under Furmark doesn't agree with any other site.
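For what it's worth, the litecoin perf/W comparison being argued here is just ratio arithmetic on the numbers quoted upthread. A quick sketch (figures are the KH/s-per-watt values given in the thread, not independently measured):

```python
# Litecoin mining throughput per watt, as quoted in this thread (KH/s per W).
khs_per_w = {"GTX 680": 1.0, "GTX 970": 1.74, "GTX 980": 1.925, "R9 290X": 2.0}

def improvement(new: str, old: str) -> float:
    """Perf/W ratio of one card over another."""
    return khs_per_w[new] / khs_per_w[old]

print(improvement("GTX 980", "GTX 680"))   # 1.925 -> "just about doubles"
print(improvement("GTX 980", "R9 290X"))   # 0.9625 -> still behind the 290X
```

So both readings of the numbers are true at once: roughly 2x over the 680, but still slightly behind the 290X in this particular workload.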
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
I'm going to post the exact power consumption the two Maxwell cards draw on each power connection.

You can't get it any more exact than this. Anyone who refuses to believe this is a troll. Period.

What if the card is power capped via Furmark detection, as is usual with Nvidia?

Those power numbers are useless because they don't say what the throughput of the card is; it could be severely throttling and you would know nothing about it.
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Looks like there needs to be more discussion of the power aspects of the 970/980, not less. How is the answer going to become any clearer, by just dropping it?


Folks trying to cap the conversation are the real trolls, IMO.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
What if the card is power capped via Furmark detection, as is usual with Nvidia?

Those power numbers are useless because they don't say what the throughput of the card is; it could be severely throttling and you would know nothing about it.

That's the point. The card should be severely throttling (AT recorded 923 MHz). Tom's card appears not to be doing anything to reduce power. Any compute workload should drop clockspeeds to keep power consumption in check; Tom's card appears to be running full boost clocks under Furmark, and that is not default behaviour.
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
The HPC crowd won't like how Maxwell improves close to zero in perf/watt vs Kepler at GPGPU. They are better off keeping their current Teslas than buying Maxwell-based ones that bring nothing new to the table for them.

I would kill to see power figures with different CPUs, and GPU usage graphs in gaming. Probably the crappier the CPU or game code, the better the perf/watt shown by the GPU.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
Looks like there needs to be more discussion of the power aspects of the 970/980, not less. How is the answer going to become any clearer, by just dropping it?


Folks trying to cap the conversation are the real trolls, IMO.
Oh, did I post specific data that ruined this whole useless discussion? Did I take away all your fun of wasting time on twisting facts? Nah, I'm sure you guys will find a way anyway.

"Nvidia cards throttling on Furmark"
Ah, give me a break. When was Furmark ever a simulation of something one would encounter in a real-life scenario? Never.
The Metro Last Light measurements are there. Tomb Raider too. Games.
Plus idle power consumption is there.

You don't need anything else.
But of course I know this won't sink in to any of your heads. So carry on. I'm out.
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
Sure it's throttling, but that is how the card is designed. Any heavy compute workload is going to drop clocks to stay within TDP; the 680 also drops clocks. Possibly Tom's card isn't dropping clocks on compute tasks.

The point is that Tom's is way off from everyone else. The 285 W load they measure under Furmark doesn't agree with any other site.

I think there is Furmark detection anyway, and that the cards are voluntarily capped for review purposes. Otherwise, why such differences in the numbers, and why a 130W difference between the reference design and the OEM card in Furmark? This can't be explained by the roughly 10% higher frequency of the OEM version; notice that the power usage in Unigine Heaven is comparable for both cards...

Gigabyte_GTX980_g1gaming_graphs_power.png


http://www.eteknix.com/gigabyte-g1-gaming-geforce-gtx-980-4gb-graphics-card-review/17/
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
The HPC crowd won't like how Maxwell improves close to zero in perf/watt vs Kepler at GPGPU. They are better off keeping their current Teslas than buying Maxwell-based ones that bring nothing new to the table for them.

I would kill to see power figures with different CPUs, and GPU usage graphs in gaming. Probably the crappier the CPU or game code, the better the perf/watt shown by the GPU.


The lower the throughput, the higher the efficiency. This works in games, where throughput rarely approaches 100%, but when it does, the chip has no option but to draw its real TDP, that is, 225-250W. The power management seems to operate at a 20-100 kHz frequency; at those frequencies gating is extremely efficient, since switching is a cakewalk for devices that can toggle at 1 GHz. The control system necessarily has some lag, but at 10-50 µs this is totally transparent to the user, and the algorithm has 10,000-50,000 cycles (of the GPU frequency) to compute the configuration.

Since voltage is an important parameter that is changed accordingly, the GPU gating frequencies must be low enough for a PWM-based power supply to accommodate the demanded rates of voltage change. Modern PWM power supplies, such as those on discrete graphics cards, generally use higher frequencies than the ones I mentioned, hence they are fully suited to such implementations.

Other schemes to reduce power draw can also be used, most notably reducing the frequencies of parts of the chip. This couldn't be implemented using the usual PLLs, of course; the reduction ratios would have to be integer values. The advantage is ease of implementation, especially since, with the scheme above, the reaction time is short enough to allow accurate estimation of the ratio that should be applied.


00-Power-Consumption-1-Millisecond.png
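The lag arithmetic above (a 20-100 kHz control loop against a ~1 GHz GPU clock) can be sanity-checked in a few lines. Both frequencies are the assumed values from this post, not measurements:

```python
# Back-of-the-envelope check: how many GPU cycles does the power-management
# algorithm get per control-loop period?
# Assumed values from the post: 20-100 kHz control loop, 1 GHz GPU clock.

GPU_CLOCK_HZ = 1e9  # assumed GPU core frequency

def cycles_available(control_hz: float, gpu_hz: float = GPU_CLOCK_HZ) -> float:
    """GPU cycles elapsing during one period of the power-control loop."""
    lag_s = 1.0 / control_hz      # control-loop lag in seconds
    return lag_s * gpu_hz

# 100 kHz loop -> 10 us lag -> 10,000 GPU cycles per decision
print(cycles_available(100e3))   # 10000.0
# 20 kHz loop -> 50 us lag -> 50,000 GPU cycles per decision
print(cycles_available(20e3))    # 50000.0
```

Which matches the 10,000-50,000 cycle budget claimed in the post.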
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
If you're building a gaming PC, a quality PSU should be its foundation. Period.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Looks like there needs to be more discussion of the power aspects of the 970/980, not less. How is the answer going to become any clearer, by just dropping it?


Folks trying to cap the conversation are the real trolls, IMO.

There are both: folks REALLY trying to cap the conversation, and those REALLY trying to promote. Who's better? Who's worse?
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
I think there is Furmark detection anyway, and that the cards are voluntarily capped for review purposes. Otherwise, why such differences in the numbers, and why a 130W difference between the reference design and the OEM card in Furmark? This can't be explained by the roughly 10% higher frequency of the OEM version; notice that the power usage in Unigine Heaven is comparable for both cards...

Gigabyte_GTX980_g1gaming_graphs_power.png


http://www.eteknix.com/gigabyte-g1-gaming-geforce-gtx-980-4gb-graphics-card-review/17/

It looks like the Gigabyte card is made for overclocking and specifically designed to draw large amounts of power (2x 8-pin vs. 2x 6-pin). It also looks like the card is running those clocks in Furmark, which will chew through a tremendous amount of power. It's factory overclocked with a 300W TDP.

I wouldn't read too much into it; according to AT, for instance, the reference 290X clocks down to 870 MHz under Furmark. The 780 and the 780 Ti do too.
 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
I don't think Tom's review takes anything away from the 970/980. Nvidia did an amazing job with Maxwell. Feature rich and excellent power efficiency. What the article does bring up though is:

1) Are the efficiency gains from Nvidia's excellent work on fine-tuning GPU Boost or in the Maxwell architecture?

2) Do the spikes in Maxwell power delivery act any different than past cards which might necessitate a higher rated PSU?

Personally, the answer to #1 is really just academic. I'm curious to know where the gains came from because it has implications to AMD and whether they can improve PowerTune similarly. Either way Nvidia hit one out of the park with efficiency gains. And #2 isn't really worth worrying about if you've got a quality PSU within Nvidia's wattage recommendations.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
Oh, did I post specific data that ruined this whole useless discussion? Did I take away all your fun of wasting time on twisting facts? Nah, I'm sure you guys will find a way anyway.

"Nvidia cards throttling on Furmark"
Ah, give me a break. When was Furmark ever a simulation of something one would encounter in a real-life scenario? Never.
The Metro Last Light measurements are there. Tomb Raider too. Games.
Plus idle power consumption is there.

You don't need anything else.
But of course I know this won't sink in to any of your heads. So carry on. I'm out.

No reason to get upset. There is clearly something afoot here and somewhere in the middle is the happy medium... :D
 

Abwx

Lifer
Apr 2, 2011
11,885
4,873
136
It looks like the Gigabyte card is made for overclocking and specifically designed to draw large amounts of power (2x 8-pin vs. 2x 6-pin). It also looks like the card is running those clocks in Furmark, which will chew through a tremendous amount of power. It's factory overclocked with a 300W TDP.

In all tests it ran at about 10% higher frequency than the reference. They overclocked it, but didn't use Furmark when doing so; they should have gotten 30-50W more than the reference, at worst.
 

jj109

Senior member
Dec 17, 2013
391
59
91
The lower the GPU load, the higher the efficiency; at high load, efficiency converges to the values of the previous generation. I used 200W as the base TDP in the example below, and indeed the 970/980 are not 150-180W TDP cards; these are genuine 250W TDP cards. The TDP claimed by Nvidia is the average power consumption in games, but push the cards into intensive computation and they will consume 240-280W.

I don't really buy this claim. I posted in the other thread, but if Maxwell's power efficiency is due to power gating during non-use, then using those transistors should result in more throughput.

Your argument depends on two factors:
1) In GPGPU, power gating isn't happening to the extent that it happens in gaming
2) The functional blocks that are kept on by GPGPU are not contributing to throughput

So what did Tom's run and what was the throughput?

Or are we just confusing the terms "low power" and "efficiency"?
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
In all tests it ran at about 10% higher frequency than the reference. They overclocked it, but didn't use Furmark when doing so; they should have gotten 30-50W more than the reference, at worst.

It's massively higher. AT shows that Furmark on the reference cooler runs at 923 MHz. If the factory overclocked card is at least holding its base/boost clocks, then you are looking at 1228 MHz base or 1329 MHz boost (33% or 44% higher, respectively).

You will see a massive gain in power consumption at those clockspeeds.
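A rough sketch of why higher clocks chew so much more power, using the classic dynamic-power rule of thumb P ∝ f·V². The 923 MHz and 1329 MHz clocks come from the post above; the wattage and voltage figures are purely illustrative assumptions, not measurements:

```python
# Dynamic CMOS power scales roughly as frequency times voltage squared.
# Clocks from the post; baseline wattage and voltages are assumed examples.

def scaled_power(p_ref: float, f_ref: float, f_new: float,
                 v_ref: float = 1.0, v_new: float = 1.0) -> float:
    """Estimate dynamic power after a frequency/voltage change (P ~ f * V^2)."""
    return p_ref * (f_new / f_ref) * (v_new / v_ref) ** 2

# Assume the throttled reference card draws ~185 W at 923 MHz.
# The same silicon at 1329 MHz, with no voltage change at all:
print(round(scaled_power(185.0, 923.0, 1329.0)))              # ~266 W
# With a modest assumed voltage bump from 1.00 V to 1.10 V:
print(round(scaled_power(185.0, 923.0, 1329.0, 1.00, 1.10)))  # ~322 W
```

Even under these made-up but plausible numbers, holding full boost clocks in Furmark puts the card far past its rated TDP, which is why the throttle-vs-no-throttle question matters so much for the measurements.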
 

ams23

Senior member
Feb 18, 2013
907
0
0
The HPC crowd won't like how Maxwell improves close to zero in perf/watt vs Kepler at GPGPU.

Well, that is not true at all. Tom's never showed GPGPU performance in their "Torture Test" power consumption measurement, nor did they specify what the actual workload was, so there is no way to determine actual efficiency.

If you look at Ryan's compute measurements from Anandtech's review, you will see that GTX 980 handily outperforms GTX 780 Ti in most compute benchmarks (other than double precision throughput of course), while completely destroying GTX 680 too. According to what Ryan said at B3D forum, "None of my compute benchmarks break TDP containment according to NVIDIA's drivers. You can definitely light up enough CUDA cores and push the card into throttling itself, but it's not violating TDP as far as I can tell."

67746.png
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
The HPC crowd won't like how Maxwell improves close to zero in perf/watt vs Kepler at GPGPU. They are better off keeping their current Teslas than buying Maxwell-based ones that bring nothing new to the table for them.

I would kill to see power figures with different CPUs, and GPU usage graphs in gaming. Probably the crappier the CPU or game code, the better the perf/watt shown by the GPU.

Those running the K10 (GK104) will most likely not switch to a possible M10 (GM204), but if GM2#0 is good enough then K20/K20X/K40 (GK110) and even some K10 users may well switch to that. Gaming Kepler has 1/24 rate double precision, gaming Maxwell 1/32, and GPGPU Kepler 1/3, so GM200 will either keep that rate or be slightly lower. I believe I saw some GRID improvements mentioned for Maxwell.

It would be interesting to see how Atom/Kabini would handle such a card. An utter waste, but still...
 

positivedoppler

Golden Member
Apr 30, 2012
1,148
256
136
I don't think Tom's review takes anything away from the 970/980. Nvidia did an amazing job with Maxwell. Feature rich and excellent power efficiency. What the article does bring up though is:

1) Are the efficiency gains from Nvidia's excellent work on fine-tuning GPU Boost or in the Maxwell architecture?

2) Do the spikes in Maxwell power delivery act any different than past cards which might necessitate a higher rated PSU?

Personally, the answer to #1 is really just academic. I'm curious to know where the gains came from because it has implications to AMD and whether they can improve PowerTune similarly. Either way Nvidia hit one out of the park with efficiency gains. And #2 isn't really worth worrying about if you've got a quality PSU within Nvidia's wattage recommendations.

I agree with your conclusion for number 1. I think Tom's roundabout way of doing the power testing was to arrive at the conclusion that it's probably mostly Boost. I don't know why some fans here are being so defensive about this; it takes absolutely nothing away from Nvidia's accomplishment, because it's a gaming video card.

The conclusion is important to some people, though. If you plan on purchasing a 980 or 970 for compute purposes other than gaming, you might want to consider a beefier power supply.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
Tom's is completely wrong in their claims. Their assumptions are stated as if they were fact. They claim that all of Maxwell's performance benefits come from improved PowerTune? Software? What?

They don't understand electricity well enough; ironically, their own oscilloscope measurements prove that this is absolutely not the case. They just don't understand what they are measuring.

I don't really buy this claim. I posted in the other thread, but if Maxwell's power efficiency is due to power gating during non-use, then using those transistors should result in more throughput.

Your argument depends on two factors:
1) In GPGPU, power gating isn't happening to the extent that it happens in gaming
2) The functional blocks that are kept on by GPGPU are not contributing to throughput

So what did Tom's run and what was the throughput?

Or are we just confusing the terms "low power" and "efficiency"?

It is absolutely unacceptable for Tom's to speak on things they have little or no understanding of. It is not only irresponsible, it is a total disservice.

They use an oscilloscope and have no idea what they are measuring. Why would they go through so much trouble using professional equipment without the background to understand what the results are telling them?

It's not even radical; it works exactly by the laws and principles of electricity. It would only take any layman two minutes to understand if pointed in the right direction.

Let's start by talking about inrush current, which can be more than a hundred times the sustained load in some systems. A lot of effort can be put into reducing it, but it's always going to be there.
http://www.interpoint.com/product_documents/DC_DC_Converters_Inrush_Current.pdf

You will have a hard time measuring it with a meter, but an oscilloscope can and will not only catch a glimpse of it, it will record it. The faster the sampling, the higher the peaks it records. This concept is easy to understand, and it is an issue that can be controlled but never avoided.

The truth is that any time a circuit is energized, the power surges in and levels off; filling capacitors gives you inrush current. Likewise, power gating results in an initial spike when gated blocks come back on. It's not something you can get rid of completely, so there are efforts to manage it as best you can, such as in this paper.

http://users.ece.cmu.edu/~djuan/TVLSI_20081121_final.pdf

Abstract- During the power mode transition, simultaneously turning on sleep transistors provides a sufficiently large surge current

Since the current flowing through sleep transistors is proportional to the total size of the turned-on sleep transistors, the number and the timing to turn on sleep transistors should be restricted to avoid the excessive surge current. Hence, the turn-on sequence of the sleep transistors, called the wake-up schedule, has become a major challenge to constrain the surge current in a power gating design

Tom's has no clue. They declare that what they see on the oscilloscope is evidence of some advanced PowerTune technique, and that is laughable logic. Most likely they are seeing Maxwell's superior gating technology in action; there is no way this is a result of software. Using an oscilloscope you can measure on a level that no meter will ever see, but if you do not understand what you are reading, you shouldn't be using the tool. Tom's has done a huge disservice to the community.
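The inrush behaviour described above follows the standard RC charging equation i(t) = (V/R)·e^(−t/RC). A minimal sketch, with all component values chosen purely for illustration rather than taken from any actual card:

```python
# Inrush current into an uncharged capacitor through a series resistance:
#   i(t) = (V/R) * exp(-t / (R*C))
# All values below are illustrative assumptions, not measurements.

import math

V = 12.0      # supply rail, volts
R = 0.05      # effective series resistance, ohms (assumed)
C = 1000e-6   # bulk input capacitance, farads (assumed)

def inrush_current(t: float) -> float:
    """Instantaneous charging current t seconds after the rail is energized."""
    return (V / R) * math.exp(-t / (R * C))

peak = inrush_current(0.0)            # 240 A instantaneous peak at switch-on
settled = inrush_current(5 * R * C)   # after ~5 time constants: nearly zero
print(peak, settled)
```

With these assumed values the instantaneous peak is enormous compared with any sustained draw, yet it decays within a fraction of a millisecond. That is exactly the kind of transient a fast oscilloscope capture records and a slow meter never sees.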
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
It's simple, really. Tom's is not measuring power properly. Maxwell may be spiking to those levels for extremely short periods, and Tom's is reporting the maximum over an interval instead of the average, or something like that.

The fact of the matter is that the GPU power use under compute loads isn't the ~240W that they claim.

67755.png


Litecoin.png


pic_disp.php

The GPU manufacturers throttle Furmark. It can't be reliably used to measure power. The mining load actually shows AMD being more efficient.
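The max-versus-average point can be illustrated with a synthetic trace. None of these numbers are measurements; they just show how one brief spike inflates a max-hold reading far above the sustained draw:

```python
# A load that spikes briefly shows a very high instantaneous maximum while
# its average (what TDP and the PSU sizing actually care about) stays far
# lower. The trace below is synthetic, not measured data.

def summarize(trace_w):
    """Return (peak, average) of a list of power samples in watts."""
    return max(trace_w), sum(trace_w) / len(trace_w)

# Hypothetical 1 ms samples: ~165 W sustained with one brief 280 W spike.
trace = [165.0] * 99 + [280.0]

peak_w, avg_w = summarize(trace)
print(peak_w)   # 280.0  -> what a max-hold style measurement reports
print(avg_w)    # 166.15 -> what the sustained draw actually is
```

If a review reports the 280 W figure as "power consumption," it is off by over 100 W from what the card sustains, which is one plausible reading of the discrepancy between Tom's numbers and everyone else's.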
 

Vinwiesel

Member
Jan 26, 2011
163
0
0
I don't see why Tom's made such a big deal out of instantaneous power draw. I'd be curious to see the same measurements done on a Sandy Bridge or newer Intel processor. If modern PSUs can handle the CPU bouncing between <10W and >150W, why wouldn't they handle a GPU going from 100W to 250W?

I think the proof is in the current vs. voltage graph, which shows no severe spikes in voltage despite the wild fluctuation in current. Honestly, Tom's should have saved this for a power-supply review rather than focusing so heavily on it in a GPU review.

Furthermore, the next page shows the 60-second chart of the 970 and 980 Windforce, where the power fluctuates between virtually 0 and 350W. I have a hard time accepting these figures: the card is under load, constantly in 3D, so when would it ever be near 0W? This setup is not ready for published reviews.