
FurMark Lies? (Confused)

Feb 25, 2011
*self-censor engaged*

tl;dr:

• I have a Kill-A-Watt

• I know FurMark is "unfair" because it loads GPUs harder than gaming normally does, producing unrealistically high temps and power-usage numbers.

• My GeForce GTS 250 used 120W more running FurMark than at idle, which is way more than spec. (110W - 25W = 85W expected difference.)

• My new 7750 uses 40W more, which is less than spec. (55W - 10W = 45W expected difference; the math is sketched out below.)

• Do any cards (and the 7750 in particular) go out of their way to avoid FurMark?
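Spelling out the arithmetic from the bullets above as a quick sketch (the 110W/25W and 55W/10W figures are the spec and idle numbers quoted there; the note about PSU losses is my own caveat, since a Kill-A-Watt reads at the wall rather than at the card):

```python
# Rough math behind the deltas above, using the figures quoted in the bullets.
# A Kill-A-Watt reads wall power, so PSU efficiency losses are included in
# the measured deltas -- the card itself draws somewhat less than this.

def expected_delta(spec_load_w, spec_idle_w):
    # increase over idle that the spec sheet implies
    return spec_load_w - spec_idle_w

gts250_expected = expected_delta(110, 25)   # 85 W expected increase
gts250_measured = 120                       # measured: well over the spec delta

hd7750_expected = expected_delta(55, 10)    # 45 W expected increase
hd7750_measured = 40                        # measured: under the spec delta

print(gts250_measured - gts250_expected)    # +35 W over the spec delta
print(hd7750_measured - hd7750_expected)    # -5 W under the spec delta
```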
 

chimaxi83

Diamond Member
May 18, 2003
Yes they do. Nvidia and AMD have both programmed drivers to recognize the Furmark executable, and throttle accordingly, regardless of temperature or power usage. This can usually be disabled, but it will throttle it by default.
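This isn't actual driver code from either vendor, but conceptually the detection described above amounts to something like the following sketch (the list, limits, and function name are all made up for illustration):

```python
# Conceptual sketch only -- not NVIDIA's or AMD's actual driver code.
# The idea: if the launching process matches a known stress-test name,
# apply a lower power/clock cap regardless of temperature.

KNOWN_STRESS_APPS = {"furmark.exe"}          # hypothetical detection list

def power_limit_for(process_name, normal_limit_w=110, throttled_limit_w=85):
    # Return the cap the driver would enforce for this process (illustrative).
    if process_name.lower() in KNOWN_STRESS_APPS:
        return throttled_limit_w             # recognized stress test: clamp it
    return normal_limit_w                    # everything else gets the full limit

print(power_limit_for("FurMark.exe"))        # 85
print(power_limit_for("metro2033.exe"))      # 110
```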
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
chimaxi83 said:
"Yes they do. Nvidia and AMD have both programmed drivers to recognize the Furmark executable, and throttle accordingly, regardless of temperature or power usage. This can usually be disabled, but it will throttle it by default."
It's a bit more complex than that. On AMD cards with PowerTune there is a global power throttle: if anything causes the card to exceed its TDP, the hardware pulls back performance. So it's not just detecting FurMark; it's enforcing the real TDP at the hardware level.

As for the GTS 250, NVIDIA doesn't have a power throttle on that card, so its TDP rating is based on "real world" conditions. The actual maximum power draw is much, much higher than the rated figure, as the OP has seen.
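To make the PowerTune point concrete, the hardware power cap behaves roughly like a small control loop: estimate board power each interval, and step clocks down whenever the estimate exceeds TDP. A minimal conceptual sketch, not AMD's actual algorithm (the clock states and power numbers are invented):

```python
# Conceptual sketch of a PowerTune-style cap -- not AMD's actual algorithm.
# Each interval: if estimated board power exceeds TDP, step the clock down;
# if there's comfortable headroom, step it back up.

TDP_W = 75                                    # illustrative board power limit
CLOCK_STEPS_MHZ = [800, 750, 700, 650, 600]   # hypothetical clock states

def next_state(idx, estimated_power_w):
    if estimated_power_w > TDP_W and idx < len(CLOCK_STEPS_MHZ) - 1:
        return idx + 1                        # over budget: drop a clock step
    if estimated_power_w < 0.9 * TDP_W and idx > 0:
        return idx - 1                        # headroom: climb back up
    return idx                                # otherwise hold

# A FurMark-like load keeps the power estimate pinned above TDP, so the loop
# settles at a lower clock; a typical game stays under the cap at full clocks.
idx = 0
for power_w in [80, 82, 81, 78, 74, 70]:      # fake per-interval estimates
    idx = next_state(idx, power_w)
    print(CLOCK_STEPS_MHZ[idx], "MHz while estimating", power_w, "W")
```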
 

lavaheadache

Diamond Member
Jan 28, 2005
Fact of the matter is that you shouldn't use FurMark. Just for kicks I ran it yesterday to test out my Lightning 7970's overclock. In Metro 2033 my card will just touch 65C @ 1200MHz, and that's the peak temp. FurMark had no trouble bringing the card to 90C in less than 5 minutes. The overclock remained stable, but I saw no reason to keep "cooking" my card.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
www.frostyhacks.blogspot.com
Furmark is a synthetic benchmark designed specifically to stress cards as hard as possible, which is exactly what it does.

When testing hardware stability you want to stress it as much as possible, so you know it's stable under all conditions. If your card struggles with power usage or temps, that's the fault of the card and not the application. The application works within the standards laid out by DirectX and the driver instructions for the hardware; in other words, it isn't doing anything wrong.

Both AMD and Nvidia do application detection and throttle their GPUs when running FurMark; if you want to really test properly, you'll need to disable that application detection or fool it by renaming the FurMark executable.
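If you go the renaming route, it's just a matter of launching a copy of the executable under a neutral name. A rough sketch of the idea (the install path and the copy's name are assumptions, and there's no guarantee a given driver keys only off the filename):

```python
# Sketch of the rename trick: run a copy of FurMark under a neutral name so
# filename-based detection (if that's all the driver checks) doesn't trigger.
# The install path and the copy's name are assumptions -- adjust for your setup.
import shutil
import subprocess
from pathlib import Path

src = Path(r"C:\Program Files (x86)\Geeks3D\FurMark\FurMark.exe")  # assumed location
dst = src.with_name("donut_test.exe")        # arbitrary neutral name

if not dst.exists():
    shutil.copy2(src, dst)                   # copy rather than rename, keep the original

subprocess.run([str(dst)])                   # launch the copy for the stress run
```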