Maxwell Power Consumption from Tom's Hardware


Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
I think most of the architecture's efficiency gains under a maximum sustained GPU load (without throttling) are eaten by the relatively high clocks compared to GK110, which they need to raise performance appreciably. What needs to be clarified is that this scenario doesn't happen under ordinary gaming loads, so it's only slightly relevant for a mid-range chip that isn't a compute chip, not with such abysmal DP performance. Teslas will most likely use only big Maxwell. If it can deliver better battery life under gaming loads, then it's a success from an efficiency POV. The chip is much smaller, so to make it appreciably faster they really need those high clocks; at lower clocks and voltages it would draw less power than GK110.
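
As a rough first-order sketch of why (a textbook dynamic-power approximation, not figures from the thread): switching power scales as

$$P_{\text{dyn}} \approx \alpha C V^2 f,$$

and since higher clocks $f$ generally require higher voltage $V$ to stay stable, power near the top of the voltage/frequency curve grows roughly like $f^3$ while performance grows at best like $f$, so perf/W falls off roughly as $1/f^2$.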
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Does anyone know if Tom's card was off the shelf, or did NV send it?

Also:
[Image: perfwatt_2560.gif (performance-per-watt chart)]

very fishy...
 

el etro

Golden Member
Jul 21, 2013
1,584
14
81
Does anyone know if Tom's card was off the shelf, or did NV send it?

Also:

very fishy...

OC greatly decreases performance per watt because of the very high clocks. Clock speed determines power consumption more than anything else on the GPU.
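
A minimal sketch of that relationship (the 15% clock and 8% voltage bumps below are illustrative assumptions, not figures from any of these reviews):

```python
# Toy first-order model: dynamic power scales ~ f * V^2, so an overclock
# that also raises voltage costs far more power than it gains performance.

def oc_impact(clock_gain: float, voltage_gain: float) -> tuple[float, float]:
    """Return (power multiplier, perf-per-watt multiplier) for an OC."""
    f = 1.0 + clock_gain       # relative core clock
    v = 1.0 + voltage_gain     # relative core voltage
    power = f * v ** 2         # dynamic power ~ f * V^2
    perf = f                   # best case: performance scales with clock
    return power, perf / power

power, ppw = oc_impact(clock_gain=0.15, voltage_gain=0.08)
print(f"power: {power:.2f}x, perf/W: {ppw:.2f}x")
# -> power: 1.34x, perf/W: 0.86x (perf/W drops even with perfect scaling)
```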
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Does anyone know if Tom's card was off the shelf, or did NV send it?

Also:
[Image: perfwatt_2560.gif (performance-per-watt chart)]

very fishy...
The Gigabyte one at TH was factory OCed, and at least one other review showed it running fairly hot (though not by that much). MSI and Asus seem to have mildly OCed stock cards that don't use substantially more power, but if you're marketing a card for overclocking, that's likely not going to cut it (and MSI, at least, is likely to release Lightning versions later, which run hot in general).

http://www.guru3d.com/articles_pages/asus_geforce_gtx_970_strix_review,7.html

I suspect Palit just overvolts it a bit at stock to manage their 10% OC.
 

know of fence

Senior member
May 28, 2009
555
2
71
Right. Tom's numbers are about double, while everyone else's are +0-20%. I think THG really needed to compare known cards on the new testing system, and work out some measurement and statistical kinks, before publishing a review with firm conclusions like they did.

I suspect the difference between the cards is just binning, assuming Tom's speculated 970 result is off, and that the stock-OC Gigabyte card is a bit power hungry (since other reviews show different results with actual 970 cards). That is, while some 970 (and 960?) GPUs are going to be salvaged parts, many are likely to handle being fully enabled but run a little on the hot side, so they don't make the cut for the stated TDP fully enabled at full speed.

AT has the 970 consuming 3 W more at idle than its bigger/better 980. This is clearly no fluke; even with different cooler configurations you can maybe add or subtract 0.7 W for each fan. Ryan Smith also says the idle voltage of both cards is 0.856 V, so the GPU is unlikely to be the cause here.
Something is different about that card, maybe the voltage regulators, maybe the VRAM. That could be it: comparing the board shots, the 980 uses 8 nondescript VRAM modules, while the 970 shows just 4, presumably cheaper ones from Samsung.

[Image: 67930.png (AnandTech idle power chart)]
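
Rough arithmetic on that 3 W gap (the two-extra-fans figure below is an assumption about the cooler designs, not something stated in the reviews): even crediting the coolers at the quoted 0.7 W per fan,

$$3\,\mathrm{W} - 2 \times 0.7\,\mathrm{W} \approx 1.6\,\mathrm{W}$$

would still be unaccounted for, which is what points at board components (voltage regulators or memory) rather than the GPU itself.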
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
AT has the 970 consuming 3 W more at idle than its bigger/better 980. This is clearly no fluke; even with different cooler configurations you can maybe add or subtract 0.7 W for each fan. Ryan Smith also says the idle voltage of both cards is 0.856 V, so the GPU is unlikely to be the cause here.
Something is different about that card, maybe the voltage regulators, maybe the VRAM. That could be it: comparing the board shots, the 980 uses 8 nondescript VRAM modules, while the 970 shows just 4, presumably cheaper ones from Samsung.

[Image: 67930.png (AnandTech idle power chart)]
That 970 has 8 modules: 4 on the front and another 4 on the back. It should be identical to the RAM on the 980 (7 GHz Samsung).
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
AT has the 970 consuming 3 W more at idle than its bigger/better 980. This is clearly no fluke; even with different cooler configurations you can maybe add or subtract 0.7 W for each fan. Ryan Smith also says the idle voltage of both cards is 0.856 V, so the GPU is unlikely to be the cause here.
Something is different about that card, maybe the voltage regulators, maybe the VRAM. That could be it: comparing the board shots, the 980 uses 8 nondescript VRAM modules, while the 970 shows just 4, presumably cheaper ones from Samsung.

[Image: 67930.png (AnandTech idle power chart)]

Those charts inherently carry about +/- 2-3% variance from chip-to-chip binning. Comparing different card models increases the difference further.
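
To put that tolerance into watts (assuming the chart shows total system draw at the wall of roughly 75 W at idle; that baseline is an assumption, not stated in the thread):

$$0.025 \times 75\,\mathrm{W} \approx 1.9\,\mathrm{W},$$

so a 3 W gap between two differently built cards sits barely outside chart noise.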
 

know of fence

Senior member
May 28, 2009
555
2
71
That 970 has 8 modules: 4 on the front and another 4 on the back. It should be identical to the RAM on the 980 (7 GHz Samsung).

You're right, they've put another 4 modules on the back, possibly to cram more stuff onto the smaller PCB. I guess we'll never learn why there is a difference, but there is a difference, that much is certain.

Hardware.fr confirms Tom's data, except their Nvidia reference card is the least power hungry. They also differentiate between idle and screen-off idle ("veille écran").
They also measure consumption for the cards separately: Hardware.fr has the Nvidia 980 reference card at just 9 W, while the triple-fan-cooled Gigabyte 980 G1 raises that to 12 W, and the 970 raises it further to 15 W.
Hardware.fr also has the 750 Ti at 7 W (same as Tom's), which is as low as Nvidia idle power gets.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Reading the iPhone 5 review, something occurred to me: why is Tegra K1 still using Kepler SMXes rather than Maxwell SMMs if the latter is so much more efficient? I think the key to the gaming efficiency of desktop Maxwell is something other than the changes in shader arrangement, and that secret sauce is already incorporated in Tegra K1; that's why NV released Tegra K1 on the Kepler architecture rather than rushing a Maxwell version to market. If the GPU portion of a Maxwell Tegra ends up 30-35% more efficient than its Kepler counterpart, rather than 100%, that will mean the above hypothesis is correct. Somehow I find it hard to believe they can be competitive with a part that is only half as efficient as their newest tech.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Reading the iPhone 5 review, something occurred to me: why is Tegra K1 still using Kepler SMXes rather than Maxwell SMMs if the latter is so much more efficient? I think the key to the gaming efficiency of desktop Maxwell is something other than the changes in shader arrangement, and that secret sauce is already incorporated in Tegra K1; that's why NV released Tegra K1 on the Kepler architecture rather than rushing a Maxwell version to market. If the GPU portion of a Maxwell Tegra ends up 30-35% more efficient than its Kepler counterpart, rather than 100%, that will mean the above hypothesis is correct. Somehow I find it hard to believe they can be competitive with a part that is only half as efficient as their newest tech.

K1 has most likely been in development longer than Maxwell. And to change it, you'd have to redo a lot and postpone everything: new validation, etc.