Maxwell Power Consumption from Tom's Hardware

Hitman928

Diamond Member
Apr 15, 2012
5,372
8,197
136
I know this was mentioned in the review thread, but THG does a great job with power consumption on video cards and I think it's definitely worth a specific thread. Lots of great data here that shows Maxwell power consumption over different loads and how they did and didn't bring down TDP on the new cards.

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941-11.html

Really worth a read, IMO, if you like much more "scientific" testing methods.
 

Hitman928

Diamond Member
Apr 15, 2012
5,372
8,197
136
Quite a while now, though it's been refined over time. I think it started when the 680 was released, but I don't remember for sure. It's the only site I go to now for video card power consumption.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
I'll copy in my comments; I'm pretty impressed. I guess they've refined it since then, as they call it the new power consumption test setup.

No, you'll need a high-power PSU because power peaks reach close to 300W. Good article from THG, that's not typical; a good analysis of how they get better "power efficiency"...

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941-11.html

Thanks for sharing that and props for Toms for starting a new power measurement standard. :thumbsup:

I'd be curious what the 290X consumption looks like in a similar measurement. I have a watt reader, but it's so slow that I wouldn't have guessed the peaks are this high and this frequent.

Pretty interesting..

Nvidia's newest architecture presents us with a whole new set of challenges for measuring power consumption. If the maxima of all four possible rails are to be measured exactly (to find out Maxwell’s power consumption reduction secrets), then a total of eight analog oscilloscope channels are needed. This is because voltage and current need to be recorded concurrently at each rail in real time. If the voltages and currents are measured separately and only multiplied later, the result may be inaccurate. So, how did we solve this problem?

We enlisted the help of HAMEG (Rohde & Schwarz) to search for a solution with us. In the end, we had to use two oscilloscopes in parallel (a master-slave triggered setup), allowing us to accurately measure and record a total of eight voltages or currents at the same time with a temporal resolution down to the microsecond.
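The point about recording voltage and current concurrently can be illustrated with a toy sketch (made-up sample values, not THG's data): when the rail voltage sags as current peaks, averaging V and I separately and multiplying gives a different answer than averaging the instantaneous V·I products.

```python
# Toy illustration (made-up values): average of V*I vs product of averages.
# When V and I move together, the two disagree, which is why each rail's
# voltage and current must be captured at the same instant.
samples = [
    (11.8, 24.0),  # heavy load: rail sags while current peaks
    (12.1, 9.0),   # light load
    (11.9, 22.0),
    (12.1, 8.0),
]

true_avg_power = sum(v * i for v, i in samples) / len(samples)

avg_v = sum(v for v, _ in samples) / len(samples)
avg_i = sum(i for _, i in samples) / len(samples)
naive_power = avg_v * avg_i  # separately averaged, then multiplied

print(round(true_avg_power, 2), round(naive_power, 2))
```

The two numbers differ whenever V and I are correlated, which is exactly the case on a GPU rail under bursty load.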


To illustrate, let’s take a look at how Maxwell behaves in the space of just 1 ms. Its power consumption jumps up and down repeatedly within this time frame, hitting a minimum of 100 W and a maximum of 290 W. Even though the average power consumption is only 176 W, the GPU draws almost 300 W when it's necessary. Above that, the GPU slows down.
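A quick sanity check on those numbers (a two-level approximation of the real waveform, my own arithmetic rather than THG's): the quoted minimum, maximum, and average imply the card spends a large fraction of that millisecond near peak power.

```python
# Two-level duty-cycle approximation of the waveform described above:
# if the card only ever sat at its 100 W floor or its 290 W peak, what
# fraction of the time at peak yields the observed 176 W average?
p_min, p_max, p_avg = 100.0, 290.0, 176.0
duty = (p_avg - p_min) / (p_max - p_min)  # = 76 / 190
print(duty)  # 0.4 -> roughly 40% of the time near peak power
```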
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
Is it really necessary to measure instantaneous power every 10µs?

A 350W PSU could probably handle a spike of 1000+W, if it happened in 10µs. The current spike would only heat things up for that 10µs, so long as it dropped back down after the spike. Is there another reason to measure power consumption at such a high resolution?
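The thermal argument above can be put in rough numbers (illustrative figures, not measurements): the energy carried by a microsecond-scale spike is tiny next to what a PSU's primary-side bulk capacitor stores.

```python
# Back-of-envelope (illustrative values): energy in a 1000 W, 10 µs spike
# versus the energy stored in a typical PSU primary-side bulk capacitor.
spike_energy_j = 1000.0 * 10e-6        # E = P * t  -> 0.01 J (10 mJ)

cap_f, cap_v = 390e-6, 400.0           # e.g. 390 µF charged to ~400 V (APFC)
stored_j = 0.5 * cap_f * cap_v ** 2    # E = C * V^2 / 2 -> ~31 J

print(spike_energy_j, round(stored_j, 1))
```

By this estimate the spike amounts to a few hundredths of a percent of the capacitor's reserve, which is the poster's point about heating.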
 

Abwx

Lifer
Apr 2, 2011
11,038
3,674
136
I'll copy in my comments; I'm pretty impressed. I guess they've refined it since then, as they call it the new power consumption test setup.

Re post also :

That's power management as implemented in CPUs: at low loads the frequency is increased, at high loads the TDP is capped, but bursts noticeably exceeding the TDP are used as a means to rapidly process stalled data. My hypothesis is that total power consumption for a given task will still be about the same as the previous gen: fewer peaks, but also fewer dips (due to the frequency boost), and the raised dips will more than counterbalance said bursts.


Here the principle is to reduce fluctuations as much as possible. THG's comparison with a compressor is fully accurate: max power is more or less capped, but minimum power is also capped (i.e. the floor is raised)..
 

realibrad

Lifer
Oct 18, 2013
12,337
898
126
Is it really necessary to measure instantaneous power every 10µs?

A 350W PSU could probably handle a spike of 1000+W, if it happened in 10µs. The current spike would only heat things up for that 10µs, so long as it dropped back down after the spike. Is there another reason to measure power consumption at such a high resolution?

It's going to push people to get higher-quality PSUs. Power spikes are hard on a PSU: the constant heating and cooling wears on the components a lot more than a constant current does. A low-end power supply, or even a worn-out one, seems far more likely to die under large spikes and drops. And as someone who has had a PSU die and take out the mobo, I'd be worried about that happening again.
 

Abwx

Lifer
Apr 2, 2011
11,038
3,674
136
Is it really necessary to measure instantaneous power every 10µs?

A 350W PSU could probably handle a spike of 1000+W, if it happened in 10µs. The current spike would only heat things up for that 10µs, so long as it dropped back down after the spike. Is there another reason to measure power consumption at such a high resolution?

Duplicate this graph for every 1ms (actually those look to be more like 100µs spikes, an order of magnitude off from your estimate..) up to 1 second and the duty rates will be the same; they also have the graph for a longer period:

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941-12.html

It's going to push people to get higher-quality PSUs. Power spikes are hard on a PSU.

The quality of the PSU's reservoir capacitors is what makes the difference...
 
Last edited:
Feb 15, 2014
119
0
76
I remember them doing such detailed power analysis around the time the Radeon 290X came out. It might have been before that, but I doubt it was when the GTX 680 came out. I don't remember seeing it in the 780 Ti review.

The 980 fluctuates from 50W to 250W with a few rare peaks at 300W. The voltage drop isn't too much, and remains within ATX spec.
Compare it with the 290X and 7990 (R9 295X2); the latter had peaks from 300W to 600W during light-to-normal loads. GPU Boost is probably what causes these fluctuations, but a good PSU should be able to handle this.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Is it really necessary to measure instantaneous power every 10µs?

A 350W PSU could probably handle a spike of 1000+W, if it happened in 10µs. The current spike would only heat things up for that 10µs, so long as it dropped back down after the spike. Is there another reason to measure power consumption at such a high resolution?

It's going to push people to get higher-quality PSUs. Power spikes are hard on a PSU: the constant heating and cooling wears on the components a lot more than a constant current does. A low-end power supply, or even a worn-out one, seems far more likely to die under large spikes and drops. And as someone who has had a PSU die and take out the mobo, I'd be worried about that happening again.

Exactly, the point is to aid in PSU selection. I don't care as much about which card and what specs (100w or 300w) but I do want to know that they can peak e.g. 50-100% higher since I want to overbuy in the PSU department. I think they are quantifying with actual data what has been observed (some PSUs die although rated at X watts).

Personally I would liken it to figuring out how to measure micro-stutter. We've heard about it and known it exists but someone finally figured out how to measure it and now with a tool it's easy to determine if it's occurring (since some people can't see it). Not that they are at all the same issue, they just have a great new(?) way to quantify it.
 

Sohaltang

Senior member
Apr 13, 2013
854
0
0
I remember them doing such detailed power analysis around the time the Radeon 290X came out. It might have been before that, but I doubt it was when the GTX 680 came out. I don't remember seeing it in the 780 Ti review.

The 980 fluctuates from 50W to 250W with a few rare peaks at 300W. The voltage drop isn't too much, and remains within ATX spec.
Compare it with the 290X and 7990 (R9 295X2); the latter had peaks from 300W to 600W during light-to-normal loads. GPU Boost is probably what causes these fluctuations, but a good PSU should be able to handle this.

The 7990 is not comparable to the 295X2. The chips are binned and its power consumption is a good bit less than CF 290Xs'. I'm still running mine on a 750 watt PSU.
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Is it really necessary to measure instantaneous power every 10µs?

A 350W PSU could probably handle a spike of 1000+W, if it happened in 10µs. The current spike would only heat things up for that 10µs, so long as it dropped back down after the spike. Is there another reason to measure power consumption at such a high resolution?

Would like more info on this. Do the spikes measured on these cards power draws present any particular issue for PSU choice when running these cards in mGPU setup?
 

positivedoppler

Golden Member
Apr 30, 2012
1,103
171
106
Thinking back to the maximum versus average power consumption findings for gaming, one fact becomes abundantly clear: AMD’s issue is not absolute performance or the efficiency of its architecture, but rather that PowerTune technology can’t adjust the power consumption quickly or finely enough depending on the actual load. This is exactly where Nvidia scores most of its points with Maxwell.

Interesting. I wonder, had AMD cared enough about its rampant power draw, whether they could have fine-tuned PowerTune even further to achieve a similar result.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
All of this is nonsense. Millisecond-level power spikes are filtered out by capacitors; that's basic electrical engineering. Unless you buy a power supply like this, you'll be fine. What matters is consistent power draw. Nvidia knows what they're doing here - do you think they want to blow out PSUs and get sued by end users and OEMs for false labeling?

IMO, it was irresponsible for Tom's Hardware to run those oscilloscope charts without adequate clarification. Spikes of a few milliseconds make no difference at all. You'd see the same kind of thing on a modern CPU if you tested it, or on your smartphone SoC. It's a major part of how power efficiency is achieved today.
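The capacitor-filtering argument above can be sketched numerically (hypothetical component values, not any specific PSU): how far a 12 V rail droops if output capacitance alone has to cover a microsecond-scale current step, before the regulation loop even reacts.

```python
# dV = I * dt / C : rail droop if output caps alone supply a short spike
# (hypothetical values; a real PSU's control loop responds well before 1 ms).
delta_i_a = 15.0            # extra amps demanded by the spike
duration_s = 10e-6          # 10 µs spike
cap_f = 6 * 2200e-6         # e.g. six 2200 µF caps on the 12 V output

droop_v = delta_i_a * duration_s / cap_f
print(round(droop_v, 4))  # ~0.011 V, tiny next to ATX's ±5% (0.6 V) window
```

Under these assumptions a 10 µs spike barely dents the rail, which is the "basic electrical engineering" being referred to.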
 

Abwx

Lifer
Apr 2, 2011
11,038
3,674
136
Thinking back to the maximum versus average power consumption findings for gaming, one fact becomes abundantly clear: AMD’s issue is not absolute performance or the efficiency of its architecture, but rather that PowerTune technology can’t adjust the power consumption quickly or finely enough depending on the actual load. This is exactly where Nvidia scores most of its points with Maxwell.

Interesting. I wonder, had AMD cared enough about its rampant power draw, whether they could have fine-tuned PowerTune even further to achieve a similar result.

The thing is that if you want to extract the max throughput, efficiency goes down the drain; in short, the more the GPU is loaded, the less efficient it becomes. In this respect Nvidia's 2x perf/watt claim doesn't hold for heavy compute, and even in games the efficiency gain is not that obvious, as the dips are shaved off by the frequency boost and the average minimum is raised. To simplify: you can have 170W max power, but if your average minimum is 120W (that is, compressed power delivery), you'll consume more than if the max is 200W and the average minimum is 50W.

With the GPUs heavily loaded, the 980 runs at 923MHz in this bench:

[benchmark chart: 67755.png]

This says that power draw tests using games will be very scene-dependent with the 980.
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
I'm not questioning why they measure what they measure; I'm wondering why they need to do it at such a high resolution. Measuring every 10 microseconds seems like overkill, and doesn't really provide any better results than what they used to do. Their most recent prior review with detailed power consumption measurements sampled at 2ms intervals, a resolution 200x coarser.

The cards clock down in Furmark because that is what Nvidia does in the drivers...
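Whether 10 µs resolution is overkill depends on how wide the spikes are. A synthetic example (toy waveform, not THG's data) shows how a 2 ms grid can miss sub-millisecond spikes entirely, and also understate the average:

```python
# Toy waveform: 176 W baseline with a 290 W spike lasting 100 µs out of
# every 1000 µs. A 2 ms sampling grid aligned to the period never lands
# on a spike; a 10 µs grid catches every one.
def power_at(t_us):
    return 290.0 if 300 <= (t_us % 1000) < 400 else 176.0

fine = [power_at(t) for t in range(0, 100_000, 10)]      # 10 µs sampling
coarse = [power_at(t) for t in range(0, 100_000, 2000)]  # 2 ms sampling

print(max(fine), max(coarse))   # 290.0 176.0
print(sum(fine) / len(fine))    # true average is ~6% above the 176 W baseline
```

A real waveform isn't this periodic, but the aliasing effect is the same: a slow sampler can report neither the true peaks nor the true average.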
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
Does this have any real-world effect on the ~65 watt savings over the 780 ti shown in the AT tests, aside from needing a PSU with decent caps?
 
Last edited:

realibrad

Lifer
Oct 18, 2013
12,337
898
126
I'm not questioning why they measure what they measure; I'm wondering why they need to do it at such a high resolution. Measuring every 10 microseconds seems like overkill, and doesn't really provide any better results than what they used to do. Their most recent prior review with detailed power consumption measurements sampled at 2ms intervals, a resolution 200x coarser.

The cards clock down in Furmark because that is what Nvidia does in the drivers...

I would say that the benefit is showing what's going on in the background. I see the importance cutting two ways: if this is shown not to harm a PSU, then Nvidia just did something great to reduce power; if the spikes turn out to put excess wear on a PSU, then it's something to consider when upgrading.
 

Abwx

Lifer
Apr 2, 2011
11,038
3,674
136
Their most recent prior review with detailed power consumption measurements sampled at 2ms intervals, a resolution 200x coarser.

This is the only way to make power measurements with graphics cards, which are complex loads with high current-variation rates; that is, the big variations have periods below 1ms, and a ton of them can add up to as much as 25% more power than an archaic setup would measure.
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
This is the only way to make power measurements with graphics cards, which are complex loads with high current-variation rates; that is, the big variations have periods below 1ms, and a ton of them can add up to as much as 25% more power than an archaic setup would measure.

The review was for the Powercolor Dual 290X, and they also used stats from their 295X2. Both were at 2ms intervals.

It'll be interesting if they go back and remeasure past cards at this high resolution to see if they missed higher spikes.
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
So you still need a good PSU that can handle all this constant fluctuation in power consumption? Seems like NV only cared about the numbers this time, IMO. The endgame for the consumer is still the same: you still need a high-quality PSU to handle one of these GPUs. It's the Boost clock fiasco all over again.


And here I was hoping the perf/watt gains were real. GPGPU is going to be a bitch in this regard.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
I've got no problem with them measuring down to the microsecond, but I'd like to see GPUs from previous generations measured the same way to establish whether Maxwell's spikes are truly something to worry about. If the GTX 680 averaged ~250W but spiked to 800W at the microsecond level, then we'd know we have nothing to worry about, because the power supplies we've been using all these years are already equipped for such spikes.

If, on the other hand, these spikes are something new and unique to Maxwell then that is something altogether different. I suspect it's the former, not the latter. Prove me wrong, Tom!
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
So you still need a good PSU that can handle all this constant fluctuation in power consumption? Seems like NV only cared about the numbers this time, IMO. The endgame for the consumer is still the same: you still need a high-quality PSU to handle one of these GPUs. It's the Boost clock fiasco all over again.


And here I was hoping the perf/watt gains were real. GPGPU is going to be a bitch in this regard.

I'm still seeing a 44 to 70 watt drop in gaming power use compared to the 780 Ti in Tom's chart for the four 980 cards. That seems real to me.

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941-13.html
 

Abwx

Lifer
Apr 2, 2011
11,038
3,674
136
The review was for the Powercolor Dual 290X, and they also used stats from their 295X2. Both were at 2ms intervals.

It'll be interesting if they go back and remeasure past cards at this high resolution to see if they missed higher spikes.

Not sure that it would change the picture, as previous GPUs could have slow regulation loops which didn't necessarily require sampling faster than a few hundred µs; measurements would be needed to confirm it, though.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
Who cares if averaged over a period of several minutes while gaming the power draw is at or below the rated TDP? Or more precisely (for those who claim the TDP is surpassed) if for a relevant period (regarding energy transfer to the cooler) the average power draw is within spec? Where does it say that spikes are bad, especially if those spikes are extremely short?
These measurements are interesting from a technical point of view but irrelevant in the end. And - I assume - many people will quote this site to make Maxwell look bad and to cry that Nvidia cheats with their TDP. If those cards were advertised as GPGPU cards, I would be critical, but if under gaming loads (these things are called GeForce...) nothing happens, I couldn't care less.
 
Last edited: