Maxwell Power Consumption from Tom's Hardware


Cerb

Elite Member
Aug 26, 2000
17,484
33
86
The HPC crowd won't like how Maxwell improves close to zero in perf/watt vs. Kepler at GPGPU. They are better off keeping their current Teslas than buying Maxwell-based ones that bring nothing new to the table for them.

Would kill to see power figures with different CPUs and GPU usage graphs in gaming. Probably the crappier the CPU or game code, the better the perf/watt shown by the GPU.
If they have an older card and need an upgrade, they're not going to lose anything by choosing Maxwell over Kepler. But that's normal. Tesla to Fermi wasn't much of an upgrade if you had a GTX 260 or better, and Fermi to Kepler wasn't much of an upgrade unless you went with a high-end Kepler. Nothing new under the sun, there.

I've been wishy-washy over the 280X or 290 and 760 or 770, with a Fermi, still. The 970 seems to have good enough idle power use, nice even frame times, and 4GB RAM, at typical 770 2GB cost. That's enough for me. I'll soon finally have a decent video card for 2014.

As to Tom's measurements, while kind of interesting, short-term power use is only useful for cable amperage needs. Wall power and cooling needs are going to be based on sizable fractions of a second, not micro or milliseconds, and software power controls have never been as fine-grained or efficient in practice as they allegedly should be able to be in theory. Even if accurate, what good are they for practical uses?
 

Abwx

Lifer
Apr 2, 2011
10,939
3,440
136
I don't see why Tom's made such a big deal out of instantaneous power draw.

Their point wasn't spikes for spikes' sake; it's that the perf/watt improvement is restricted to a limited throughput range. We could say that power consumption as a function of throughput rises linearly for, say, a 780, while for the 980 it rises exponentially, and the two curves converge at maximum throughput. That is, the perf/watt improvement gets closer to zero the closer you get to max throughput.

Start doing intensive computations with the card and it will consume the same as the 780/780ti in such applications.
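A toy numerical sketch of that claim (all numbers here are invented for illustration, not measured data): if both cards converge to the same power draw at full throughput, the perf/watt advantage shrinks toward 1x there.

```python
# Illustrative model of the claim that the perf/watt advantage shrinks
# as throughput approaches maximum. All numbers are made up.

def power_780(util):
    """Hypothetical: power rises roughly linearly with utilization."""
    return 60 + 190 * util          # watts, with a 60 W idle floor

def power_980(util):
    """Hypothetical: power stays low at partial load, then rises
    steeply and converges with the 780 at full utilization."""
    return 60 + 190 * util ** 3     # same 250 W endpoint at util = 1.0

for util in (0.25, 0.5, 0.75, 1.0):
    p780, p980 = power_780(util), power_980(util)
    # Performance is proportional to util in both cases, so the
    # perf/watt advantage reduces to the power ratio p780 / p980.
    print(f"util={util:.2f}  780={p780:6.1f} W  980={p980:6.1f} W  "
          f"advantage={p780 / p980:4.2f}x")
```

Under these made-up curves the advantage is large at partial load and exactly 1x at full load, which is the shape of the argument being made.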
 

jj109

Senior member
Dec 17, 2013
391
59
91
Start doing intensive computations with the card and it will consume the same as the 780/780ti in such applications.

And how much throughput are we getting out of Maxwell vs Kepler in that situation? If you ignore the throughput part, then you are just talking wattage.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
I don't see why Tom's made such a big deal out of instantaneous power draw. I'd be curious to see the same measurements done on a Sandy Bridge or newer Intel processor. If modern PSUs can handle the CPU bouncing between <10 W and >150 W, why wouldn't they handle a GPU going from 100 W to 250 W?

I think the proof is in the current vs. voltage graph, which shows there are no severe spikes in the voltage, despite the wild fluctuation in current. Honestly, Tom's should have saved this for a power-supply review, rather than focusing so heavily on it for a GPU review.

Furthermore, the next page shows the 60-second chart of the 970 and 980 Windforce, and the power fluctuates between virtually 0 and 350 W. I have a hard time accepting these figures: the card is under load, constantly in 3D, so when would it ever be so near 0 W? This setup is not ready for published reviews.

My thoughts exactly. Sure, the equipment looks nice (I've seen better *grin*), but it's no use when one doesn't know when, where, and how to use it.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Their point wasn't spikes for spikes' sake; it's that the perf/watt improvement is restricted to a limited throughput range. We could say that power consumption as a function of throughput rises linearly for, say, a 780, while for the 980 it rises exponentially, and the two curves converge at maximum throughput. That is, the perf/watt improvement gets closer to zero the closer you get to max throughput.

Start doing intensive computations with the card and it will consume the same as the 780/780ti in such applications.

You really won't let this one go will ya?

Just wait till the Tesla versions of the Maxwell cards come out, and then decide whether or not the conclusion you're so eager to preach is right.

They could perhaps be a few months away.
 

Abwx

Lifer
Apr 2, 2011
10,939
3,440
136
You really won't let this one go will ya?

Just wait till the Tesla versions of the Maxwell cards come out, and then decide whether or not the conclusion you're so eager to preach is right.

They could perhaps be a few months away.

If one buys such a card and does intensive computation, it will consume much more than what is allowed by the power connectors, because Nvidia specified them for the official TDP. Think about the consequences. It wasn't so long ago that some other company was bashed for their cooler design, so now I should be silent because, well, it's not AMD that is under the microscope..?
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
If one buys such a card and does intensive computation, it will consume much more than what is allowed by the power connectors, because Nvidia specified them for the official TDP. Think about the consequences. It wasn't so long ago that some other company was bashed for their cooler design, so now I should be silent because, well, it's not AMD that is under the microscope..?

Sorry for my ignorance but what intensive computation are you referring to?

And don't worry about the wires, the connectors, etc. They can handle quite a lot more current than you think ;)
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
If one buys such a card and does intensive computation, it will consume much more than what is allowed by the power connectors, because Nvidia specified them for the official TDP. Think about the consequences. It wasn't so long ago that some other company was bashed for their cooler design, so now I should be silent because, well, it's not AMD that is under the microscope..?

Well, a point could be that you were silent then and not now. Is that statement incorrect?
 

Wall Street

Senior member
Mar 28, 2012
691
44
91
If one buys such a card and does intensive computation, it will consume much more than what is allowed by the power connectors, because Nvidia specified them for the official TDP. Think about the consequences. It wasn't so long ago that some other company was bashed for their cooler design, so now I should be silent because, well, it's not AMD that is under the microscope..?

The current over the wires during the very short time periods that Tom's is measuring is there because the output caps of the PSU and the input caps of the GPU are connected by a very, very low-resistance wire. If you are complaining about picosecond power surges between two capacitor banks over 16 or 18 AWG wire, then you are just showing that you don't know how capacitors work.

The MOSFETs, which are actually the parts that are subject to blowing, aren't being measured. Do you know how a 12-phase CPU VRM circuit works? Each phase is run way above its rated power for a fraction of a millisecond and then turns off, at which point the next phase turns on. Running components like this is not a problem, because it takes the heatsinks many milliseconds to reach heat saturation, and heat death is the most common failure mode.

The card's average power over a second or so is actually what matters, unless its peak is basically a short, which is not the case here.

The crazy thing about Tom's picosecond measurements is that the larger the output caps on the PSU, the lower the ESR those caps have, and the lower the gauge of the PSU wires, the higher the instantaneous current will be. The peak current looks worse the better the PSU is.
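The "better PSU looks worse" point can be sketched with a back-of-the-envelope Ohm's-law estimate: the brief surge between the two capacitor banks is limited by the total resistance in the path. The component values below are assumptions chosen for illustration, not measurements of any real PSU or card.

```python
# Rough peak-current estimate between two capacitor banks joined by a
# short cable: I_peak = delta_V / (ESR_psu + R_wire + ESR_gpu).
# All resistances and the voltage sag are illustrative assumptions.

def peak_current(delta_v, esr_psu, r_wire, esr_gpu):
    """Instantaneous current when the GPU input caps sag by delta_v."""
    return delta_v / (esr_psu + r_wire + esr_gpu)

# A 0.2 V sag across ~25 mOhm total path gives a large, very brief spike:
print(peak_current(0.2, 0.010, 0.005, 0.010))  # -> 8.0 (amps)

# A "better" PSU with lower-ESR output caps makes the spike LARGER:
print(peak_current(0.2, 0.005, 0.005, 0.010))  # -> 10.0 (amps)
```

Lower ESR and lower-gauge (thicker) wire both shrink the denominator, so the measured instantaneous current goes up even though nothing about the load changed.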
 

Abwx

Lifer
Apr 2, 2011
10,939
3,440
136
The current over the wires during the very short time periods that Tom's is measuring is there because the output caps of the PSU and the input caps of the GPU are connected by a very, very low-resistance wire. If you are complaining about picosecond power surges between two capacitor banks over 16 or 18 AWG wire, then you are just showing that you don't know how capacitors work.

The MOSFETs, which are actually the parts that are subject to blowing, aren't being measured. Do you know how a 12-phase CPU VRM circuit works? Each phase is run way above its rated power for a fraction of a millisecond and then turns off, at which point the next phase turns on. Running components like this is not a problem, because it takes the heatsinks many milliseconds to reach heat saturation, and heat death is the most common failure mode.

The card's average power over a second or so is actually what matters, unless its peak is basically a short, which is not the case here.

The crazy thing about Tom's picosecond measurements is that the larger the output caps on the PSU, the lower the ESR those caps have, and the lower the gauge of the PSU wires, the higher the instantaneous current will be. The peak current looks worse the better the PSU is.

You failed to understand that in FurMark the current is sustained at values such that the card's dissipated power reaches about 250 W; the current will be held roughly constant at 20 A or so. Any intensive computation performed on this card will produce this result. As THG explained in their own words, the power consumption reduction applies only if you don't use all the throughput.

In games, max throughput is reached only during periods measured in microseconds, hence the microsecond spikes; but increase the data flow and the spikes will grow to the point that there are no more spikes, just a constant current at 20 A.
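The spikes-versus-sustained argument boils down to duty cycle: the same 20 A peak averages out very differently depending on how much of the time the card spends at max throughput. A minimal sketch, with an invented duty cycle for the gaming case:

```python
# Average power = peak power x fraction of time spent at the peak.
# 20 A on the 12 V rail is ~240 W peak; the 0.3 duty cycle is invented.

PEAK_W = 20 * 12  # ~240 W while a spike lasts

def avg_power(peak_w, duty_cycle):
    """Mean draw of a load toggling between peak and (near) zero."""
    return peak_w * duty_cycle

print(avg_power(PEAK_W, 0.3))  # microsecond gaming spikes -> 72.0
print(avg_power(PEAK_W, 1.0))  # sustained compute load    -> 240.0
```

At a duty cycle of 1.0 the "spikes" simply become the steady-state draw, which is the scenario being described for compute workloads.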
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
You failed to understand that in FurMark the current is sustained at values such that the card's dissipated power reaches about 250 W; the current will be held roughly constant at 20 A or so. Any intensive computation performed on this card will produce this result. As THG explained in their own words, the power consumption reduction applies only if you don't use all the throughput.

In games, max throughput is reached only during periods measured in microseconds, hence the microsecond spikes; but increase the data flow and the spikes will grow to the point that there are no more spikes, just a constant current at 20 A.

What about gaming? How does it do there according to your findings? I gotta say Abwx, this is really strange.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
What about gaming? How does it do there according to your findings?

That's obvious, but not applicable to the conversation. It possibly looks like Maxwell's efficiency is software driven. That's fine. It doesn't really matter how they do it. What makes it interesting is that if it's not the architecture that's increasing efficiency then it's something that AMD might be able to duplicate with GCN.
 

Abwx

Lifer
Apr 2, 2011
10,939
3,440
136
What about gaming? How does it do there according to your findings? I gotta say Abwx, this is really strange.

I said that in games, which are throughput-limited by nature, there will be noticeably less power consumption than the previous gen. The thing is that the claimed TDP applies only to games, and in this respect Nvidia is somewhat not telling the whole story, as their marketing campaign implies that it's a raw 2x perf/watt improvement that applies to everything, while it's only the average improvement in games; in intensive computing there is zero improvement.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
I said that in games, which are throughput-limited by nature, there will be noticeably less power consumption than the previous gen. The thing is that the claimed TDP applies only to games, and in this respect Nvidia is somewhat not telling the whole story, as their marketing campaign implies that it's a raw 2x perf/watt improvement that applies to everything, while it's only the average improvement in games; in intensive computing there is zero improvement.

Even if what you're saying is accurate, and I'm not saying that it is by any stretch of the imagination, why would this be important for GPUs made primarily for gaming? Why are you focusing so strongly on this really strange aspect of a gaming card? Can you just tell us what your ultimate point is?
 

Abwx

Lifer
Apr 2, 2011
10,939
3,440
136
Even if what you're saying is accurate, and I'm not saying that it is by any stretch of the imagination, why would this be important for GPUs made primarily for gaming? Why are you focusing so strongly on this really strange aspect of a gaming card? Can you just tell us what your ultimate point is?

There are people who will buy it for other purposes; I think that can't be denied. So far Nvidia has willingly stayed silent on this matter; they refrained from stating that it wouldn't suit computing usage, and as such they are deliberately misleading part of the public into believing that it will have 2x the perf/watt of the previous gen. THG has done good work in this respect by doing a more in-depth analysis that shows Nvidia's claims are not true for all users.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Props for THG! Very brave of them to use the new equipment.
And to actually dare to publish results that are miles away from everyone else's.

It will be an even better article once they divulge the 3D game and the GPGPU app that they used.

And they already have the quantum theory about Maxwell's efficiency based on their off-the-charts results?
If that's not brave, I dunno...
 
Feb 19, 2009
10,457
10
76
The HPC crowd won't like how Maxwell improves close to zero in perf/watt vs. Kepler at GPGPU. They are better off keeping their current Teslas than buying Maxwell-based ones that bring nothing new to the table for them.

Would kill to see power figures with different CPUs and GPU usage graphs in gaming. Probably the crappier the CPU or game code, the better the perf/watt shown by the GPU.

To be fair, this is a mid-range Maxwell clearly aimed at gaming, so to say it has no improvement in perf/W for HPC work isn't a revelation. For that, we'll have to see big Maxwell in action.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
It looks like FurMark provides a score which can be used to compute perf/watt; check the 750 Ti's perf/watt when running this bench:

http://www.phoronix.com/scan.php?page=article&item=nvidia_maxwell_750linux&num=10

Phoronix power efficiency tests are a complete joke and utter crap for comparing solely the GPU. There is too much system power involved, and the results favour high-performance, high-power GPUs over smaller, more efficient units.



While Phoronix gives system perf/W, this is most definitely not indicative of the actual efficiency of the silicon alone.

An i7 is not 3x more efficient than the i3s.
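The objection is easy to demonstrate: with a fixed platform draw sitting in the denominator, a big, power-hungry GPU can look more "efficient" at the system level than a genuinely more efficient small one. The wattages and scores below are invented purely to show the arithmetic, not Phoronix's actual data.

```python
# Why whole-system perf/W can mislead: the fixed platform draw dilutes
# differences between GPUs. All numbers below are invented.

PLATFORM_W = 90.0   # assumed CPU/board/drive draw, identical in both runs

def system_perf_per_watt(score, gpu_w):
    """Perf/W the way a wall-socket measurement reports it."""
    return score / (gpu_w + PLATFORM_W)

def gpu_perf_per_watt(score, gpu_w):
    """Perf/W of the GPU silicon alone."""
    return score / gpu_w

small = ("efficient 60 W card", 1000, 60.0)
big   = ("fast 250 W card",     3000, 250.0)

for name, score, watts in (small, big):
    print(f"{name}: system {system_perf_per_watt(score, watts):.2f}/W, "
          f"GPU-only {gpu_perf_per_watt(score, watts):.2f}/W")
```

With these numbers the big card wins the system-level comparison while the small card wins on GPU-only perf/W, which is exactly the distortion being described.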

 

Abwx

Lifer
Apr 2, 2011
10,939
3,440
136
Phoronix power efficiency tests are a complete joke and utter crap for comparing solely the GPU. There is too much system power involved, and the results favour high-performance, high-power GPUs over smaller, more efficient units.



While Phoronix gives system perf/W, this is most definitely not indicative of the actual efficiency of the silicon alone.

There are two other cards for comparison, the GT 650 and the R7 260X; these are relatively small GPUs that should not benefit from increased throughput, particularly the 260X.

FurMark is an OpenGL bench which renders an image, so it has some validity.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
That's obvious, but not applicable to the conversation. It possibly looks like Maxwell's efficiency is software driven. That's fine. It doesn't really matter how they do it. What makes it interesting is that if it's not the architecture that's increasing efficiency then it's something that AMD might be able to duplicate with GCN.

But it's not. It's happening so fast there's no way.

Please take the time to read my previous post.