WCCFtech: Memory allocation problem with GTX 970 [UPDATE] PCPer: NVIDIA response

Status
Not open for further replies.

Redentor

Member
Apr 2, 2005
97
14
71
The 970 still has 4GB of memory. So advertising it as a 4GB card isn't untruthful. It's the ROPs and L2 cache numbers that need to change to match what the card actually has.

The memory bus, too. 256-bit =/= 224-bit + 32-bit.

Like a CPU with 4 cores @ 3 GHz =/= a CPU with 1 core @ 12 GHz.
 

Atreidin

Senior member
Mar 31, 2011
464
27
86
Yeah, it's strange that people aren't gushing over the 290, at least in NA where electricity is cheap.

A lot of people jumped on the "290/X is hot and loud, LOLZ!!!1!!11!one!!" bandwagon and won't re-evaluate that stance.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
A lot of people jumped on the "290/X is hot and loud, LOLZ!!!1!!11!one!!" bandwagon and won't re-evaluate that stance.

Loud appears to be taken care of by the better aftermarket coolers, but heat still seems to be a concern. Why else would a seemingly record number of people be modding their 290/290X cards with water cooling? It's not all about more overclocking headroom; it's about keeping GPU and VRM temps low enough that they stay at peak efficiency. Those cards, perhaps more than others (though I'm not sure), lose performance at higher temperatures, and not just when it's high enough to throttle, although that is a major issue too when it happens.

And I'd love to re-evaluate; I've been attempting to do just that for the past week, ha. I can find cards, like the 290X Tri-X, which seem to perform wonderfully, better than seemingly every card on the market from any competitor (in GPU temp and fan noise), but I cannot find how much extra heat is pumped into the case. The cooler is more efficient at shedding heat from the card, but that heat goes somewhere.
 
Feb 19, 2009
10,457
10
76
Loud appears to be taken care of by the better aftermarket coolers, but heat still seems to be a concern. Why else would a seemingly record number of people be modding their 290/290X cards with water cooling? It's not all about more overclocking headroom; it's about keeping GPU and VRM temps low enough that they stay at peak efficiency. Those cards, perhaps more than others (though I'm not sure), lose performance at higher temperatures, and not just when it's high enough to throttle, although that is a major issue too when it happens.

Plenty of custom models run just fine, cool and quiet too. Heck, the popular & cheap Tri-X R9 290/X runs cooler & quieter than many cards, including NV's.

Modders do it because it's our passion. I also water-modded a GTX 670...
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
233
106
Plenty of custom models run just fine, cool and quiet too. Heck, the popular & cheap Tri-X R9 290/X runs cooler & quieter than many cards, including NV's.
Correct, but it's the hotter models that spoil the big picture. They're the ones that carve a bad image into people's brains.

You know why that is?

'Cause the standards are low / non-existent. They're not proud of what they do; it's not the card they would keep for themselves. Just an also-ran, pretty much. It's amazing how AMD has been losing the market with better hardware. Heads must start rolling. AMD needs a nazi-style leader. For a change.
 
Last edited:

destrekor

Lifer
Nov 18, 2005
28,799
359
126
Plenty of custom models run just fine, cool and quiet too. Heck, the popular & cheap Tri-X R9 290/X runs cooler & quieter than many cards, including NV's.

Modders do it because it's our passion. I also water-modded a GTX 670...

Well, the point is, I'm trying to figure out the physics of the situation. These cards are kept cool enough with improved aftermarket cooler designs, so what is happening with the power fed into the thing? That's a serious load of wattage, and as we know, no chip is THAT efficient with power; practically all of the input power ends up as heat. That means these cards must create more waste heat than cards that handle lower input power in the first place.
So these coolers are incredibly efficient and help shed the heat from the heatsinks.
Where my lack of physics knowledge is impeding me: does a card that uses more wattage, with a cooler keeping the GPU at 60°C, shed more waste heat from its heatsink than a card that uses comparably less wattage but has its GPU kept at, say, 70°C by its cooler? I'm assuming that even though the GPU is cooler, the heatsink is simply taking more heat and removing it more effectively with the fans, distributing more warmed air into the case.
 

.vodka

Golden Member
Dec 5, 2014
1,203
1,538
136
The stock 290/X burned such an image into many people's minds with its scorching 95°C operating temperature and noise that it still can't be erased, even with cards like the Tri-X and its kind out there... which perform better due to no throttling and consume less than the stock cards because they keep the GPU at 70°C instead of 95°C. That power-consumption difference has been translated into even more performance with a little factory overclock. Improved in every way, yet no different to some. Interestingly, the same happened with the GTX 480 in temperature, noise, and power consumption, and with the custom models if there were any (I think there weren't, only aftermarket coolers for the DIY crowd), yet the result was nowhere near as pronounced as in AMD's case. Double standards, anyone?

AMD had better not make the same mistake with the 380X... if it takes the same route as the 295X2 out of the factory, as rumored, then it'll be a much different outcome.


-----------------

I'm interested in an answer to destrekor's question. I'd like to better understand the physics of the matter too, but my uneducated guess would be that the temperature at which the card operates is a measure of how fast the heat it is producing is being pulled away. In that sense, the custom 290/Xs out there are far more efficient than the stock blower from hell, but I'd guess that if you put the 290X's blower on a 980 it wouldn't have to work as hard (I mean, spin the fan as fast) to remove its heat, because the 980 is producing less. But how much waste heat is expelled from the card is directly related to how much heat is produced in the first place.

That extra heat could be coming from the 50-60 W more that custom 290Xs consume over the 980... It could also be that Hawaii itself is much denser than GM204 and requires a more robust heatsink to work at temperatures that are comfortable for us (remember, AMD said it can withstand 95°C without a single problem). Hawaii packs 6,200 million transistors into 438 mm² (14.15 million/mm²) while GM204 is 5,200 million transistors in 400 mm² (13 million/mm²)... huh. NVIDIA hasn't been that dense so far; interesting.
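(The density figures above check out with quick back-of-the-envelope arithmetic; the transistor counts and die areas are the commonly quoted numbers for the two chips.)

```python
# Transistor density comparison: Hawaii (R9 290X) vs GM204 (GTX 980),
# using the commonly quoted transistor counts and die areas.
hawaii_transistors_m = 6200   # millions of transistors
hawaii_area_mm2 = 438         # die area in mm^2
gm204_transistors_m = 5200
gm204_area_mm2 = 400

hawaii_density = hawaii_transistors_m / hawaii_area_mm2
gm204_density = gm204_transistors_m / gm204_area_mm2

print(f"Hawaii: {hawaii_density:.2f} M transistors/mm^2")  # ~14.16
print(f"GM204:  {gm204_density:.2f} M transistors/mm^2")   # 13.00
```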
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
A lot of people jumped on the "290/X is hot and loud, LOLZ!!!1!!11!one!!" bandwagon and won't re-evaluate that stance.

I owned the reference 290 and it was "hot and loud, LOLZ!!!1!!11!one!!" I'd give the custom cooler version a shot though.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
Out of all the 290/290X cards, the only ones I truly love are the Tri-X and the Lightning 290X. The Lightning more. Doesn't seem like any of the dual-fan models are even worth considering. :)
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
The heat is directly related to the power consumption. Even if you have a card with custom cooling that can pull the heat away so fast the temps stay low on your GPU, the card still is creating the same amount of heat. It's just the better coolers pull it away and push it elsewhere, depending on the design.

Most custom coolers pull the heat off the chip and blow it in your case. If your case has good air flow, the heat is spread out and eventually pushed out of your case.

If you're using a 250 W GPU, it generates heat in proportion. The better the cooler, the faster that heat is pulled off the chip and pushed elsewhere.

This means you can have a GPU that only uses 150 W with a higher on-chip temperature than one that uses 250 W. That doesn't mean the 150 W chip is creating more heat; it just means its cooler is doing a worse job than the one on the 250 W chip.

So basically, the heat generated doesn't change much from the reference 290X to the Tri-X. It's just that the Tri-X is way better at pulling the heat off the chip and dissipating it in your case. Spreading the heat out across a larger volume makes it easier to transfer out.
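(A toy steady-state model illustrates this point: at equilibrium, all of the dissipated power leaves the card as heat, and the die temperature depends only on the cooler's thermal resistance. The R-theta values below are invented for illustration, not measured from any real cooler.)

```python
# Toy steady-state model: die temp = ambient + power * thermal resistance.
# The same 250 W of heat leaves either card; only the die temperature differs.
def die_temp(power_w, r_theta_c_per_w, ambient_c=25.0):
    """Steady-state die temperature for a given cooler."""
    return ambient_c + power_w * r_theta_c_per_w

power = 250.0               # watts dissipated (essentially all becomes heat)
reference_cooler = 0.28     # degC/W -- hypothetical stock blower
tri_x_cooler = 0.18         # degC/W -- hypothetical triple-fan cooler

print(die_temp(power, reference_cooler))  # 95.0 C: throttling territory
print(die_temp(power, tri_x_cooler))      # 70.0 C: but still 250 W into the case
```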

As for the GTX 480, no no.
There were plenty of vendor custom cooling options for starters, but it never was able to shake its reputation. It was always known as a hot power hog, even though the later 480s had much, much better thermals. The difference in that situation was NVIDIA's action: they quickly respun the GF100 and rushed out a GTX 580.

The truth is that the GTX 580 used about as much power as the 480. In its defense, it was a full chip, though; NVIDIA was only able to manage cut-down GF100s for their 400 series. But the GTX 580 was still a very power-hungry card; it's just that NVIDIA was able to keep it cool without it sounding like a plane taking off. There was also the 6970, which used more power than the 5870, so the situation didn't look nearly as bad.

Actually, the 580 stayed pretty cool and wasn't too loud. It addressed all the downfalls of the GTX 480, the major complaints anyway.

The 480 was never able to shake its negative image. It's still referred to today, all the time, and it's remembered as a hot, loud power hog.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
[Nope]


This is a technical discussion, not a place to post NSFW YouTube videos.

-Rvenger
 
Last edited by a moderator:

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I owned the reference 290 and it was "hot and loud, LOLZ!!!1!!11!one!!" I'd give the custom cooler version a shot though.

No kidding, and then add in the Bitcoin mining craze. I've supported Radeons in my personal rig for years, but even I couldn't justify that terrible cooler decision by throwing my money at them.

And of course the AMD crew told me to wait for custom cards. So I sort of did, and then the price of the 290X started to reach insane price points:
http://anandtech.com/show/7758/radeon-r9-290x-retail-prices-hit-900

When custom cards finally did hit, there was no way I was going to accept the ballooned cost. I ended up getting a GTX 780 Lightning for less than what a stock R9 290 was going for, and I haven't looked back.
 

EightySix Four

Diamond Member
Jul 17, 2004
5,122
52
91
The heat is directly related to the power consumption. Even if you have a card with custom cooling that can pull the heat away so fast the temps stay low on your GPU, the card still is creating the same amount of heat. It's just the better coolers pull it away and push it elsewhere, depending on the design.

This isn't true: power consumption goes down with lower temperatures. CPUs/GPUs running at a lower temperature draw less power than the exact same CPUs/GPUs at a higher temperature.
 
Last edited:

ocre

Golden Member
Dec 26, 2008
1,594
7
81
They draw less power the cooler they are. But 250 watts is still 250 watts that has to be dissipated.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
They draw less power the cooler they are. But 250 watts is still 250 watts that has to be dissipated.

Yeah, most of these cards are going to have their power consumption likely rated around the point of peak efficiency. That is, at moderate GPU temperature running under a full "average" workload (gaming, not Furmark or high compute loads).

What happens is that the VRMs and the transistors, iirc, become far less efficient at higher operating temperatures. Thus, to satisfy the workload and queue, the system first tries to draw more power to crunch data at the same speed. If this continues, that contributes to additional heat, and if it crosses a threshold, then the throttling begins.

But even at the coldest of temperatures, these chips are going to draw full power as designed; efficiency doesn't continue to improve forever, it slows until the gains diminish. Liquid nitrogen cooling won't magically make the system 20% more efficient on power draw than a machine well cooled by air or even water.

And thus, a 250 W card is still going to draw 250 W under heavy workloads. Since no integrated circuit is 100% efficient, practically all input power ends up as heat, which gets removed by the cooler and, as you said, dissipated into the system (or out, with blower designs).
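(The feedback loop described above, where hotter silicon draws more power and therefore gets hotter still, can be sketched as a simple fixed-point iteration. Every coefficient here is invented for illustration; none is measured from a real card.)

```python
# Toy model of the temperature/power feedback loop:
# hotter silicon leaks more -> draws more power -> gets hotter still,
# until it settles below the throttle point or runs past it.
def settle(base_power_w, r_theta=0.20, ambient=25.0,
           leak_per_degree=0.004, throttle_at=95.0, steps=50):
    power = base_power_w
    temp = ambient
    for _ in range(steps):
        temp = ambient + power * r_theta
        if temp >= throttle_at:
            return temp, power, True          # throttling kicks in
        # extra draw grows with temperature above ambient (crude leakage proxy)
        power = base_power_w * (1 + leak_per_degree * (temp - ambient))
    return temp, power, False

print(settle(200))   # converges around ~73 C / ~238 W: no throttle
print(settle(300))   # overshoots 95 C within a few steps: throttles
```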
 

EightySix Four

Diamond Member
Jul 17, 2004
5,122
52
91
It is the other way around. He just mistyped.

Yeah, brain fart. Typed it right the first time.

They draw less power the cooler they are. But 250 watts is still 250 watts that has to be dissipated.

Right, but if you're running cooler, you're drawing less than 250w.

And thus, a 250W card is still going to draw 250W under heavy workloads.

That depends on how the manufacturer is measuring and reporting the power draw. Depending on the temperature, it could be more than 250 W or less than 250 W under a heavy workload, depending on whether they used averages, a best-case scenario, a worst-case scenario, or anything in between.
 
Last edited:

Pneumothorax

Golden Member
Nov 4, 2002
1,181
23
81
Gosh, don't you guys know about Ohm's law? Cooling the circuits reduces resistance. IF the voltage/speed of the chip remains constant, lowering the resistance will lower the current used.
 

WaitingForNehalem

Platinum Member
Aug 24, 2008
2,497
0
71
Gosh, don't you guys know about Ohm's law? Cooling the circuits reduces resistance. IF the voltage/speed of the chip remains constant, lowering the resistance will lower the current used.

This is just wrong. V = IR. If V stays constant, and R decreases then I must increase. The reason higher temperatures increase power consumption is due to semiconductor physics as IDC explains, not Ohm's law.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
Gosh, don't you guys know about Ohm's law? Cooling the circuits reduces resistance. IF the voltage/speed of the chip remains constant, lowering the resistance will lower the current used.
You have it backwards. But a piece of silicon has another factor involved: the hotter it gets, the higher the current leakage, so even though some parts of the die will draw less power because resistance goes up, that is more than offset by the leakage.
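(A rough way to see why the thread lands on leakage rather than Ohm's law: model total power as a temperature-independent dynamic switching term plus a leakage term that grows roughly exponentially with temperature. The constants below are illustrative only, not from any datasheet.)

```python
import math

# Illustrative split of GPU power: dynamic switching power is fixed by
# capacitance, voltage, and clock, while leakage grows with temperature.
def total_power(dynamic_w, leak_ref_w, temp_c, ref_temp_c=60.0, scale_c=25.0):
    """Dynamic power plus leakage that roughly doubles every `scale_c`
    degrees above the reference temperature (a crude rule of thumb)."""
    leakage = leak_ref_w * 2 ** ((temp_c - ref_temp_c) / scale_c)
    return dynamic_w + leakage

# Same clocks and voltage, different operating temperature:
print(f"{total_power(180, 30, 60):.0f} W at 60 C")   # 210 W
print(f"{total_power(180, 30, 94):.0f} W at 94 C")   # ~257 W
```

With made-up but plausible constants, the gap lands in the same ballpark as the ~200 W vs. ~250 W water-versus-reference figures reported later in the thread.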
 

Pneumothorax

Golden Member
Nov 4, 2002
1,181
23
81
You guys are right; guess that's what happens when you're >20 years out from college. So in these tiny little circuits, leakage is much more of an issue than resistance?
 
Feb 19, 2009
10,457
10
76
But even at the coldest of temperatures, these chips are going to draw full power as designed; efficiency doesn't continue to improve forever, it slows until the gains diminish. Liquid nitrogen cooling won't magically make the system 20% more efficient on power draw than a machine well cooled by air or even water.

Not true, actually; anyone who has played with water cooling can attest to it. Running the same-spec (clocks, vcore) GPU at the typical 80-85°C versus sub-60°C will save power. Mine on water were loading 200 W each while bitcoin mining, but at 94°C on the stock reference cooler, they were loading 250 W. It's a huge difference.

Liquid nitrogen will lower power usage even more if clocks/vcore are constant. ;)
 