Nvidia shows some details on Cooling chamber on Upcoming Card Plus Video of Black Ops


Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
What I have been reading on these forums over the past few days (from people such as Apoppin and others) is that Furmark stresses the GPU to an 'absolute worst' level which is impossible to encounter in any real-world scenario. If this is true, I'm not sure how rational a yardstick Furmark is. It might be ideal for testing overclocking stability, but in terms of power draw...perhaps gaming/CUDA/whatever scenarios are sufficient.

You don't think it's worth seeing what a card does in worst-case scenarios?

Before cars are sold, they are tested waaay beyond what a regular user would put them through. They do it to make sure they absolutely don't fail when put through regular use.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I do. Didn't have a problem with extreme testing, as long as it is an attempt to offer apples to apples. The key for me is to see how the GPUs fare in a higher-quality case, though, considering these are enthusiast-priced products. Don't play games in an open environment -- actually in a high-quality case with nice airflow. Appreciated the testing done with both environments, and do appreciate worst-case tests in everything -- torture tests if you will.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
I do. Didn't have a problem with extreme testing, as long as it is an attempt to offer apples to apples. The key for me is to see how the GPUs fare in a higher-quality case, though, considering these are enthusiast-priced products. Don't play games in an open environment -- actually in a high-quality case with nice airflow. Appreciated the testing done with both environments, and do appreciate worst-case tests in everything -- torture tests if you will.

Yeah, I remember when AnandTech tested some 4870s (I'm not sure if they were reference ones or not) and some of them failed when running Furmark. So yeah, I think torture tests are perfectly valid.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I recall a general consensus that Nvidia changed the way they calculated their TDP on the GTX 480 because they couldn't get it under the 300W requirement using their usual way of measuring it, which was borne out when Furmark did indeed put it well over 300W.

This was inherently dishonest and inserted confusion into what was a fairly standard measurement.

Furmark is designed to fully load GPUs and is a rational yardstick to use to determine the TDP of a GPU and give the buyer a known standard to base their purchase on.

AMD did/does use this same method of measuring TDP for their CPUs, though. Soooo it's a case of both companies taking a measurement that does not have a specific industry-standard method and using it in a manner that best reflects how their product operates. They both either do or have done it, so until there is an industry spec that says "this is how TDP should be measured," any and all companies can choose how to advertise the TDP their products draw. Nobody games with Furmark, and no game comes close to matching Furmark's ability to raise power draw, so it's kind of an iffy situation.

The best way to clear up all of this confusion would be to have two forms of measurement for CPUs and GPUs - average TDP at load and absolute maximum TDP at load.
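
Something like this, just to illustrate the idea (a toy Python sketch with made-up wattage numbers, not real measurements -- the function name and sample lists are purely hypothetical):

```python
# Toy sketch of the two proposed figures: "average TDP at load" and
# "absolute maximum TDP at load", computed from a logged power trace.
# All numbers below are made up for illustration, not real measurements.

def summarize_power(power_samples):
    """Return (average_watts, peak_watts) for one logged load run."""
    average_watts = sum(power_samples) / len(power_samples)
    peak_watts = max(power_samples)
    return average_watts, peak_watts

# Hypothetical per-second board power readings (watts) for two workloads.
gaming_run  = [212, 230, 225, 241, 219, 236]   # hovers around its average
furmark_run = [298, 305, 311, 309, 302, 307]   # pinned near the worst case

print("gaming  avg/peak:", summarize_power(gaming_run))
print("furmark avg/peak:", summarize_power(furmark_run))
```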
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
So yeah, I think torture tests are perfectly valid.

Yes, I think most agree that torture testing with Furmark for, say, overclocking stability is valid. However, going back to your car analogy, they test cars in the real world (i.e., on public roads, in the Nevada desert, in cold climates, etc.). Therefore, if you took the car to the same places, it's going to work without issues.

However, we are talking about using Furmark to measure power consumption. You can't ever achieve the same power consumption with any game. In other words, with Furmark you are testing the power circuitry of the videocard and its ability to sustain stability beyond real-world load conditions - and that's fine for component testing, for peace of mind. Therefore, while Furmark is good for stress testing, it shouldn't be used as a gauge of power consumption.

Again, let's look at the definition of TDP.

"The TDP is typically not the most power the chip could ever draw, such as by a power virus, but rather the maximum power that it would draw when running real applications."

^^^ By this very definition, Furmark will exceed TDP ratings for most videocards. What changed after the HD4870 series is that AMD revised the driver so that Furmark no longer loads the cards the same way. In other words, HD4870 cards had no driver protection, so they would be unrealistically loaded to 100%.

NV uses "gaming load" when listing its TDPs, whereas ATI uses "Furmark load" when listing its TDPs.

AMD has improved the detection of FurMark and OCCT by adding a more serious technique based on the ratio of texture to ALU instructions. In other words, AMD is doing everything in its power to not allow Furmark to load its cards to 100% - AMD did this to protect its valuable customers from videocard failures.
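
To give a rough idea of what that kind of heuristic looks like, here's a toy sketch -- not AMD's actual driver code; it assumes a workload with an abnormally low texture-to-ALU ratio gets flagged, and the threshold is made up:

```python
# Toy illustration only, not AMD's driver code. Assumes the heuristic flags
# shaders whose texture-to-ALU instruction ratio is abnormally low, since a
# power virus like Furmark is almost pure ALU work. The threshold is made up.

SUSPECT_TEX_TO_ALU_RATIO = 0.05

def looks_like_power_virus(texture_instructions, alu_instructions):
    """Flag a shader whose texture/ALU mix is far outside normal game workloads."""
    if alu_instructions == 0:
        return False
    return texture_instructions / alu_instructions < SUSPECT_TEX_TO_ALU_RATIO

# e.g. looks_like_power_virus(2, 400) -> True; a typical game shader -> False
```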

Blastingcap, the idea that AMD is much closer to TDP in Furmark is not very important because, besides the software driver protection mentioned above, AMD went one step further and enabled PowerPlay protection in hardware starting with the HD58xx series.

"For Cypress based cards, AMD has hard-wired the protection and has implemented a hardware solution to the VRM problem, by dedicating a very small portion of Cypress’s die to a monitoring chip. This chip monitors the VRM. If the chip detects a dangerous situation (overload), the chip will immediately throttle back the card by one PowerPlay level."

Therefore, one cannot compare the Furmark loads on the HD4870 vs. the HD5870/68xx series because Furmark no longer works the same way on modern AMD videocards. I applaud AMD for doing this because Furmark does not realistically load the GPU's power circuitry anyway.
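
For the hardware side, here's a rough sketch of the throttle-by-one-PowerPlay-level logic the quote above describes -- purely illustrative, not AMD's firmware; the hook functions, level names, and current limit are all hypothetical:

```python
# Rough sketch of the protection logic described in the quote above: a monitor
# watches the VRMs and, on overload, drops the card down exactly one PowerPlay
# level. Not AMD's firmware -- the hook functions, level names, and the current
# limit are all hypothetical.

POWERPLAY_LEVELS = ["high", "medium", "low"]   # assumed ordering, fastest first
VRM_CURRENT_LIMIT = 75.0                       # amps; made-up threshold

def protect_vrm(read_vrm_current, set_powerplay_level, current_level=0):
    """One pass of the monitoring loop; returns the (possibly lowered) level index."""
    overloaded = read_vrm_current() > VRM_CURRENT_LIMIT
    if overloaded and current_level < len(POWERPLAY_LEVELS) - 1:
        current_level += 1                     # back off by one level only
        set_powerplay_level(POWERPLAY_LEVELS[current_level])
    return current_level

# Stubbed example: an 82 A reading trips the limit and throttles to "medium".
protect_vrm(lambda: 82.0, lambda level: print("throttling to", level))
```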
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
I recall a general consensus that Nvidia changed the way they calculated their TDP on the GTX 480 because they couldn't get it under the 300W requirement using their usual way of measuring it, which was borne out when Furmark did indeed put it well over 300W.

This was inherently dishonest and inserted confusion into what was a fairly standard measurement.

Furmark is designed to fully load GPUs and is a rational yardstick to use to determine the TDP of a GPU and give the buyer a known standard to base their purchase on.
That may be your opinion, but it's also a fact that ATI used to treat Furmark like a virus and purposely altered its behavior, probably because people were burning cards at an alarming rate in the name of 'testing' them.
In the end, however either company rates TDP, they have to answer to the OEM builders, who engineer a product they have to warranty.
I'm not a fan of it being used to represent any kind of real-world power draw.
In part because, as heat rises, so does inefficiency, and the two start to cascade rapidly. Run that test on 4-way SLI 480s, for example, and you could probably create a number so much higher than anything real-world gaming could push those same cards to that it would boggle the mind :)
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Yes, I think most agree that torture testing with Furmark for, say, overclocking stability is valid. However, going back to your car analogy, they test cars in the real world (i.e., on public roads, in the Nevada desert, in cold climates, etc.). Therefore, if you took the car to the same places, it's going to work without issues.

However, we are talking about using Furmark to measure power consumption. You can't ever achieve the same power consumption with any game. In other words, with Furmark you are testing the power circuitry of the videocard and its ability to sustain stability beyond real-world load conditions - and that's fine for component testing, for peace of mind. Therefore, while Furmark is good for stress testing, it shouldn't be used as a gauge of power consumption.

Yeah, I'm not saying it should be used to gauge power usage. Just saying it's okay to use to see if a card will burn out or not.

On the car analogy: they take them to a special test facility where they drive them over jumps and roads that look like the paths from Unigine Heaven at 70mph.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
On the car analogy: they take them to a special test facility where they drive them over jumps and roads that look like the paths from Unigine Heaven at 70mph.

I guess you haven't been to Russia then, where the roads collapse under heavy trucks and the potholes are the size of an average farm animal. Manufacturers even ship cars with more stringent tire rubber requirements to sustain the extra stress :awe:
 

Aristotelian

Golden Member
Jan 30, 2010
1,246
11
76
You don't think it's worth seeing what a card does in worst-case scenarios?

Before cars are sold, they are tested waaay beyond what a regular user would put them through. They do it to make sure they absolutely don't fail when put through regular use.

In my first post I said 'perhaps', because I was thinking about it. Cars are smashed into walls, and how the chassis collapses is interpreted in a way that indicates safety for the crash test dummies and so forth. Cars aren't, however, crushed with a wrecking ball, nor do they have to survive such a test in order to be considered safe.

I can't seem to find Apoppin's post on Furmark, so I can't quote it here. The gist of that post seemed to be that the test is way beyond anything the card can ever face. Just as the wrecking ball is beyond anything the car will have to face. If the claim about Furmark is analogous, then using that test seems strange to me in order to make a claim about possible power usage. I'm sorry that I can't find Apoppin's post, or this would be more clear. My point is that I'm not sure what Furmark is meant to show if it produces power draw levels that are impossible to mimic in gaming and so on.

Here's apoppin on the issue:

It [Furmark, my addition] is dangerous. It is like a virus that overdraws the circuit and it *can* damage your PSU, MB and/or video card.

Furmark shows *nothing* related to anything you will *ever* encounter in gaming; power draw in any gaming never comes close to anything like Furmark's ridiculous test.
- so what good really is Furmark?
:confused:


What nonsense. Furmark is a broken torture test that shows absolutely nothing related to real-world usage. I am not using it in my testing any longer.

So if he is right, then I think there can be a legitimate case made for saying that it's a useless tool to indicate anything about real-world use.
 

Throckmorton

Lifer
Aug 23, 2007
16,829
3
0
This subforum is frightening. I can't believe the stuff people will argue over.

Why can't everybody just agree that better cooling with less noise is a good thing?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I can't seem to find Apoppin's post on Furmark, so I can't quote it here. The gist of that post seemed to be that the test is way beyond anything the card can ever face.

Using Furmark can be extremely dangerous precisely because of what you said. When manufacturers design VRMs and MOSFETs, they engineer them to support the maximum real-world power consumption under the real applications the videocard will be used for. Furmark is a power virus and is able to overload every part of the GPU (think about it: can you utilize 100% of the GPU in a game, with no aspect of the design left underutilized? Of course not). As a result, the VRMs/MOSFETs are simply not designed to handle a maximum theoretical GPU workload. This can result in unnecessary videocard component failure.

So how representative is the power consumption from such a test? :|

Why can't everybody just agree that better cooling with less noise is a good thing?

Great post. New and better cards from both AMD and Nvidia ensure competition, so in the end we win from price/performance wars and new technological innovations.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
This subforum is frightening. I can't believe the stuff people will argue over.

Why can't everybody just agree that better cooling with less noise is a good thing?

The highly amusing thing is that it's often those whom people would class as NV-sided who are saying noise is a non-issue, and therefore saying this whole effort is pointless.

Sometimes people want to argue even when a sane person would say it's good, end of.

No one really wants to hear a presentation on such improvements, but everyone appreciates them.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
The highly amusing thing is that it's often those whom people would class as NV-sided who are saying noise is a non-issue, and therefore saying this whole effort is pointless.

Sometimes people want to argue even when a sane person would say it's good, end of.

No one really wants to hear a presentation on such improvements, but everyone appreciates them.
I don't know. I am not sold on a better heatsink, but if heat/noise/power consumption was the 480's problem, then the upcoming card seems to have improved.

The one thing I am interested in for future Nvidia cards is whether they will release a card that uses all the SMs on board. If they are going to build it with 10000 CUDA cores and disable 9990 of them, then I will personally pass. In fact, I will pass if not all on-board parts are used.

I always wonder why Nvidia doesn't release a 480 with 512 cores on water. I guess heat isn't the only problem.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
The highly amusing thing is that it's often those whom people would class as NV-sided who are saying noise is a non-issue, and therefore saying this whole effort is pointless.

Sometimes people want to argue even when a sane person would say it's good, end of.

No one really wants to hear a presentation on such improvements, but everyone appreciates them.

Well, the only thing noisier than Fermi (because Fermi had not come out yet) was the new 5870 reviewed here at Anand's.
So I guess ATI fans scored a win when the faster card was also the louder card, lol.

Nobody wants to simulate a vacuum inside their computer. Reference blower coolers are always louder, imo.
 

OCGuy

Lifer
Jul 12, 2000
27,224
36
91
Thanks. If the GTX 580 card is 15-20% faster than a GTX 480, then it'll only consume about 5-10% less power (ballparking it, obviously). Looks to be ~300W again then, yikes.

LOL. Way to move the goalposts. Most people were saying that IF nV could release a 512 "CUDA core" version of the 480, it would consume even more power.

If they can do 20% better performance using less power at higher clocks, this fast...I think it will be a win for them.
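
Rough math on that, using the ballpark figures from the quote above as assumed midpoints (not measured numbers):

```python
# Assumed midpoints from the quoted ballpark: ~20% faster, ~7.5% less power.
speedup = 1.20
power_ratio = 0.925

perf_per_watt_gain = speedup / power_ratio      # ~1.30x
print(f"~{(perf_per_watt_gain - 1) * 100:.0f}% better performance per watt")
```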
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
That's funny, my entire rig only consumes ~230W playing Crysis Warhead when my 5850 is clocked to 1000MHz. Looks like you need to find better sources.
Thanks for providing yourself as an example :rolleyes:

Really? That sounds awfully low for a 4GHz i7 + a heavily OC'd 5850. I kind of doubt that...
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
So better cooling/lower noise is now a bad thing because only eco-freaks would like it?

Is this thread from another universe?


No, it's a thread where AMD doesn't have the fastest GPU...and thus other factors are dragged into the debate.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
No, it's a thread where AMD doesn't have the fastest GPU...and thus other factors are dragged into the debate.

Dude. It's a thread about the cooling system on the GTX 580.
How are power, noise and cooling "other factors"? They ARE the factors.
Unless you want to talk about Black Ops...?


(And if you don't care about noise and power... why are you even in this thread?)
 

Dark_Archonis

Member
Sep 19, 2010
88
1
0
This subforum is frightening. I can't believe the stuff people will argue over.

Why can't everybody just agree that better cooling with less noise is a good thing?

The reason is that some people here are marketing puppets, and believe it or not, are actually being paid to have meaningless arguments on tech forums.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
@Dark Archonis, Lonbjerg was the only person saying he identified more noise with more power, and thus didn't want a quieter card.

I think most people would agree that less noise = better. The end.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
For those who don't like cooler and quieter.....the exit is that way---------------->

This tech looks cool. NV may not have invented it, but any better cooler designs are always welcomed. When you are talking about a high-end card, who cares if it costs $10 more when it retails for $400? :)

Nice post OP, thanks for the link!
 

Puffnstuff

Lifer
Mar 9, 2005
16,146
4,844
136
I'm glad to see Nvidia using a better HSF on their GPUs. The fans on my 480s are why I'm water cooling them now. When gaming they sounded like jets taking off, and the overbearing noise was not tolerable late at night.