
Interested in torturing your 580 yet?

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I am surprised videocard torturing has not yet become an official sport of the online hardware community, with prizes given to those who can achieve the highest power draw. Just a shame, really, that both AMD and NV do everything in their power to keep you from maxing out your FPS in this game.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Why is everyone making such a big deal about this anyway? Both vendors say FurMark does not represent anything realistic, and I'm sure they know better than anyone what their cards are likely to go through. Even DICE's games don't put the cards under that kind of stress.

4870s and GTX 280s have been used in extreme overclocking attempts and the stock components coped fine. As soon as you fire up FurMark, things start popping and failing.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
The question is: what if some game designer decides to use an effect similar to the one in FurMark at some point? Then just about everybody is SOL.

The whole power virus thing is nothing but a cop-out. You'll never hear a CPU or memory manufacturer telling you (or getting away with telling you) that it's okay for certain programs to destroy the device and that the solution is simply not to run them.

Imagine if Intel shipped CPUs that failed Linpack and they just said, "Don't worry guys, just don't run Linpack." There would be quite an uproar.
 

Voo

Golden Member
Feb 27, 2009
1,684
0
76
The question is: what if some game designer decides to use an effect similar to the one in FurMark at some point? Then just about everybody is SOL.
The difference is that GPUs, with all their parallel hardware, just aren't designed for 100% utilization, because there's no reasonable algorithm out there that would use them that way.
That's completely different for CPUs, because solving systems of linear equations (which is basically what Linpack does) isn't anything extraordinary.
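For the curious, the kernel of a Linpack-style run really is ordinary numerical work. Here's a rough, minimal sketch (Python/NumPy, purely for illustration; real Linpack uses tuned LU factorization through BLAS/LAPACK and reports GFLOPS, and the problem size below is arbitrary):

# Linpack-style workload sketch: solve Ax = b for a dense random matrix
# and check the residual.
import numpy as np

n = 2000                                   # arbitrary problem size for this sketch
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

x = np.linalg.solve(A, b)                  # LU factorization + triangular solves
residual = np.linalg.norm(A @ x - b) / np.linalg.norm(b)
print(f"relative residual: {residual:.2e}")

# Nominal LU cost is ~(2/3)*n^3 floating-point operations, which is what
# keeps the FPUs busy for the duration of the run.
print(f"~{(2.0 / 3.0) * n**3 / 1e9:.1f} GFLOP of work")

Nothing in there is exotic; it's the same dense linear algebra that plenty of real applications run all day.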
 

Jdawg84

Senior member
Feb 6, 2010
256
0
0
I'd gladly let someone do it to my 580s if they know what they're doing and are experienced.
 

Voo

Golden Member
Feb 27, 2009
1,684
0
76
I'd gladly let someone do it to my 580s if they know what they're doing and are experienced.
Why? Do FurMark and OCCT even have scores or some other grading system? I don't see why anyone would want to put a rather expensive GPU at risk just so it can run FurMark faster.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
The difference is that GPUs, with all their parallel hardware, just aren't designed for 100% utilization, because there's no reasonable algorithm out there that would use them that way.
That's completely different for CPUs, because solving systems of linear equations (which is basically what Linpack does) isn't anything extraordinary.
Folding@home: 99-100% load on the GPU.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The whole power virus thing is nothing but a cop-out.

Before FurMark was invented, how did one test a videocard's power consumption? Under real-world applications. If FurMark were the only accurate measure of real-world power consumption, then every power-consumption test that preceded its invention would have been wrong. That clearly isn't the case, so FurMark cannot be the only accurate measure of real-world power consumption.

In fact, FurMark is by its very definition a program that pushes a GPU to a maximum power draw that cannot be reached under any real-world application; that is one of the main reasons it was created in the first place, and both AMD and NV have dismissed it as unrepresentative. Should we test a motherboard's stability in a 95°C heat chamber to claim that it's stable in the real world? And if it fails, is the motherboard then not sufficient for real-world usage at a 20-25°C room temperature?

Folding@home: 99-100% load on the GPU.

Fur rendering is specifically adapted to overheat the GPU; the load is not comparable to any real-world application. I ran a Radeon 4890 on MilkyWay@Home 24/7 for more than half a year, and it did not come close to the load FurMark put on my card.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
I wonder if half the people trying to get upset about something both nV and AMD are doing realize that their cars are artificially limited in speed, usually because of the tire ratings.

Do you guys go out and buy the $300 programmer to remove the governor? I doubt it.

As long as the nV and AMD cards perform how they are supposed to, I couldn't care less if they disable a program that is basically useless.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I wonder if half the people trying to get upset about something both nV and AMD are doing realize that their cars are artificially limited in speed, usually because of the tire ratings.

Do you guys go out and buy the $300 programmer to remove the governor? I doubt it.

As long as the nV and AMD cards perform how they are supposed to, I couldn't care less if they disable a program that is basically useless.

Oh, they are limited by the tires! They'll buy N-rated tires, hit the pavement, gun it to 120, blow a tire, skid out, and then sue someone.

God Bless America!

EDIT: What I meant is that I can see someone doing exactly this with this info: unlock it, crank the juice, blow their GPU, then sue someone. Haha.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
As long as the nV and AMD cards perform how they are supposed to, I couldn't care less if they disable a program that is basically useless.

Would a car manufacturer pay your health bills if you got into a car accident because your tire blew out when you knowingly and willingly were driving the car above the tire's speed rating? Not in Canada :). Not sure about America, but most likely the driver would lose the case.

The alternative would be for nV and AMD to continue issuing RMAs for a certain % of videocards that fail as a result of unrealistic torture testing. I am pretty sure AMD/nV were not too thrilled about fulfilling warranty claims for failed MOSFETs/VRMs on the 4870/90 and GTX 200 series, etc. AMD was the first to take the step of limiting this "fun" with the HD 58xx series, and now the GTX 580 follows suit. I am glad to see that every videocard will come protected out of the box from now on.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Would a car manufacturer pay your health bills if you got into a car accident because your tire blew out when you knowingly and willingly were driving the car above the tire's speed rating? Most likely not.

The alternative would be for nV and AMD to continue issuing RMAs for a certain % of videocards that fail as a result of unrealistic torture testing. I am pretty sure AMD/nV were not too thrilled about fulfilling warranty claims for failed MOSFETs/VRMs on the 4870/90 and GTX 200 series, etc. AMD was the first to take the step of limiting this "fun" with the HD 58xx series, and now the GTX 580 follows suit. I am glad to see that every videocard will come protected out of the box from now on.

In the same context, should a manufacturer be required to repair a product if the user damages it in an otherwise unrealistic manner?

These programs were designed to push the cards to a level the manufacturers probably never intended, hence the inclusion of limiters.

So in the end, why bother disabling anything? Just revoke the warranty if someone blew their card doing this. No?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Just revoke the warranty if someone blew their card doing this. No?

I think it would be too difficult for AMD/NV to prove that a card failed because the end user torture tested it with FurMark. The card will be shipped back dead in the box, and little Johnny will say it blew up running Crysis so that he can get a new card. We have seen members on our forum file RMAs for overclocked processors that failed, rather than taking responsibility for the overclocking, which is not covered under warranty. So I think what AMD/NV did is more effective than pointing fingers at their loyal customers over why a card became defective. It takes the guessing out of the equation and, most importantly, it reduces the chances of a card failing, which actually benefits consumers.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I think it would be too difficult for AMD/NV to prove that a card failed because the end user torture tested it with FurMark. The card will be shipped back dead in the box, and little Johnny will say it blew up running Crysis so that he can get a new card. We have seen members on our forum file RMAs for overclocked processors that failed, rather than taking responsibility for the overclocking, which is not covered under warranty. So I think what AMD/NV did is more effective than pointing fingers at their loyal customers over why a card became defective. It takes the guessing out of the equation and, most importantly, it reduces the chances of a card failing, which actually benefits consumers.

Good point. I guess I'm more of a 'screw you' kind of guy. haha.
 

nyker96

Diamond Member
Apr 19, 2005
5,630
2
81
Is this some kind of joke someone came up with to cook as many 580s as possible? I wonder what kind of idiot would remove the voltage limiter and pump that much power through a card whose components probably are not designed for such a surge. But I guess someone's bound to try it.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76

It wouldn't be hard to argue that Linpack/LinX made Prime95 obsolete, because machines that could pass Prime95 indefinitely could still crash in Linpack due to the vastly different heat and power consumption it creates. Linpack set a new standard for what was considered stable hardware. Lots of laptops produced today are incapable of passing Linpack at stock settings; personally, I think that is unacceptable.

My point still stands. All it would take is one game using the same visual effect FurMark does for the "that load will never be reached in a real-world application" argument to be thrown out the window. Then what? Are AMD and Nvidia going to send out documentation telling people not to run certain types of code on their cards?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It wouldn't be hard to argue that Linpack/LinX made Prime95 obsolete, because machines that could pass Prime95 indefinitely could still crash in Linpack

Machines that can pass Linpack indefinitely can still fail under 3DMark06 and Crysis. Passing Linpack doesn't provide a 100% guarantee that your system is stable, and neither did passing Prime95. This is why, ultimately, I have decided to stress test all my overclocks with the most demanding programs I actually run. If my CPU passes Linpack/Prime95 but fails under DiRT 2, the overclock isn't stable for me.

Also, it's doubtful that any game will come out that stresses the GPU like FurMark, since no game can utilize 100% of the GPU given its parallel architecture and the inevitable shader, texture, memory-bandwidth, or ROP bottleneck somewhere.
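To put a rough number on the "there's always a bottleneck" point, here's a back-of-the-envelope roofline check. All of the figures below are made-up placeholders, not any real card's specs, and the FLOPs-per-byte values are just illustrative guesses:

# Back-of-the-envelope roofline check. Every number here is a hypothetical
# placeholder, not a spec of any real GPU.
peak_gflops = 1500.0        # assumed peak shader throughput, GFLOP/s
peak_bw_gbs = 190.0         # assumed peak memory bandwidth, GB/s

# A game-like shader might do a few FLOPs per byte fetched from memory,
# while a FurMark-like fur shader keeps almost everything in registers.
workloads = [("game-like shader", 4.0), ("FurMark-like shader", 40.0)]

for name, flops_per_byte in workloads:
    bandwidth_limited = peak_bw_gbs * flops_per_byte   # GFLOP/s if memory-bound
    achievable = min(peak_gflops, bandwidth_limited)
    print(f"{name}: ~{achievable:.0f} GFLOP/s "
          f"({achievable / peak_gflops:.0%} of peak ALU throughput)")

With those made-up numbers, the game-like shader tops out around half of peak ALU throughput because it hits the bandwidth wall first, while the FurMark-like case can keep the ALUs pinned, which is exactly why its power draw is so much higher.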

AMD and NV pick their components, from MOSFETs and VRMs to the GPU cooling heatsink, based on TDP. TDP is typically not the most power the chip could ever draw, such as under a power virus, but rather the maximum power it would draw when running real-world applications. AMD and NV do not guarantee anything with regard to the "maximum power the chip could ever draw." If you wish to test that maximum for yourself and your card breaks, then what? Should you be entitled to the warranty?
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Of course. Linpack and Prime95 mainly test CPU stability, while Crysis and 3DMark06 test more than just the CPU. At the very least, if your machine passes Linpack but fails Crysis or 3DMark, you know the CPU is not the culprit. The OCCT 3.0 power-supply test is probably the closest thing we have to a single torture test that covers the whole system.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Save your rage for when a game actually is affected by this. I have a feeling you will be waiting for a while.
 

Stoneburner

Diamond Member
May 29, 2003
3,491
0
76
I remember when Doom 3 came out (yeah, I'm that old). The id people said something like: "This game is nuts, dudes. This game is so crazy it may screw up your GPU overclocks! Especially if you're looking for the red keycard. The blue keycard is okay. Don't bother overclocking, my ninjas; we can only spawn four monsters at a time anyway, because Hellspawn, for some reason, follows immigration and green card (get it!) restrictions."

Sure enough, my 9700 Pro overclock failed when running Doom 3.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
This is not the same thing as what Prime95 is to a CPU.

This is more like revving your engine to the redline on the tachometer for hours on end and expecting NO negative effects.
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
Potentially good for overclocking, more juice and whatnot. Still, 350 watts and it's not even overclocked??? WTF...
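For context, a quick bit of arithmetic against the PCI Express power-delivery limits shows why that 350 W figure raises eyebrows (the connector ratings below are the standard spec values; the 350 W is just the number quoted above, and how far a given card actually goes over budget depends on how it was measured):

# Rough power-budget check against standard PCI Express delivery limits:
# slot = 75 W, 6-pin = 75 W, 8-pin = 150 W. The GTX 580 has one 6-pin
# and one 8-pin connector.
slot_w, six_pin_w, eight_pin_w = 75, 75, 150
board_budget_w = slot_w + six_pin_w + eight_pin_w      # 300 W total
measured_w = 350                                       # figure quoted in the post above
print(f"Spec budget: {board_budget_w} W, reported draw: {measured_w} W "
      f"({measured_w - board_budget_w} W over the connector spec)")

Which is presumably a big part of why both vendors started adding power limiters in the first place.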