
FurMark Lies? (Confused)

*self-censor engaged*

tl;dr:

• I have a Kill-A-Watt

• I know FurMark is "unfair" because it loads GPUs harder than gaming normally does, producing unrealistically high temperature and power-draw numbers.

• My GeForce GTS 250 drew 120 W more running FurMark than at idle, which is well above the spec-implied difference (110 W load - 25 W idle = 85 W).

• My new 7750 draws 40 W more, which is under the spec-implied difference (55 W load - 10 W idle = 45 W).

• Do any cards (and the 7750 in particular) go out of their way to avoid FurMark?
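The arithmetic above can be sketched as a quick sanity check. The wattage figures come from the post itself; the helper function is just for illustration:

```python
# Compare the measured wall-power rise (load minus idle) against the
# spec-implied rise for each card. All numbers are from the post above.
def power_delta(load_w, idle_w):
    """Return the load-minus-idle power difference in watts."""
    return load_w - idle_w

# GTS 250: spec figures imply 110 W load - 25 W idle = 85 W rise,
# but the Kill-A-Watt measured a 120 W rise under FurMark.
gts250_spec_delta = power_delta(110, 25)
gts250_measured_delta = 120

# HD 7750: spec figures imply 55 W load - 10 W idle = 45 W rise,
# yet only a 40 W rise was measured, hinting at throttling.
hd7750_spec_delta = power_delta(55, 10)
hd7750_measured_delta = 40

print(gts250_measured_delta > gts250_spec_delta)  # GTS 250 exceeds spec
print(hd7750_measured_delta < hd7750_spec_delta)  # 7750 stays under spec
```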
 
Yes, they do. Nvidia and AMD have both programmed their drivers to recognize the FurMark executable and throttle accordingly, regardless of temperature or power usage. This can usually be disabled, but throttling is on by default.
 
It's a bit more complex than that. On AMD cards with PowerTune, there is a very specific global throttle: if anything causes the card to exceed its TDP, performance is pulled back. So it's not just detecting FurMark; PowerTune enforces the real TDP at the hardware level.

As for the GTS 250, NVIDIA doesn't have a throttle on that card, so its TDP is based on "real world" conditions. The actual maximum draw is much, much higher, as the OP has seen.
 
Fact of the matter is that you shouldn't use FurMark. Just for kicks I ran it yesterday to test my Lightning 7970's overclock. In Metro 2033 my card just touches 65°C at 1200 MHz, and that's the peak temp. FurMark had no trouble bringing the card to 90°C in less than 5 minutes. The overclock remained stable, but I saw no reason to keep "cooking" my card.
 
Furmark is a synthetic benchmark designed specifically to stress cards as hard as possible, which is exactly what it does.

When testing hardware stability you want to stress it as much as possible, so you know it's stable under all conditions. If your card struggles with power usage or temps, that's the fault of the card, not the application: the application works within the standards laid out by DirectX and the driver instructions for the hardware. In other words, it's not doing anything wrong.

Both AMD and Nvidia do application detection and throttle their GPUs when running FurMark. If you want to test properly, you'll need to disable that application detection or fool it by renaming the FurMark executable.
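The rename workaround mentioned above can be sketched in a few shell commands. The filenames here are placeholders: a dummy file stands in for the real FurMark install, and "donut_test.exe" is just an arbitrary neutral name:

```shell
# Create a placeholder standing in for the real FurMark binary
# (the actual install path on your system is an assumption).
touch FurMark.exe

# Copy it under a neutral name so the driver's executable-name
# detection no longer matches "FurMark". Copying (rather than
# renaming) keeps the original launcher shortcuts working.
cp FurMark.exe donut_test.exe

# You would then launch the renamed copy instead of the original.
ls -l donut_test.exe
```

Note that this only defeats name-based detection; on cards with a hardware power limiter like AMD's PowerTune, the TDP cap still applies regardless of what the executable is called.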
 