EXPreview: GeForce GTX 560 Ti Power Consumption Protection Analysis

Jodiuh

Senior member
Oct 25, 2005
287
1
81
Pretty sure Anand took a look @ this when the 580 came out, but it'd be nice to see how this affects gaming performance @ higher clocks/volts.

Here's the article @ expreview.com...

http://en.expreview.com/2011/02/16/...er-consumption-protection-analysis/14670.html


This thread is currently in violation of posted guidelines on 3 counts; see post #12.

OP has until noon EST to address the violations, lest the thread be locked and the OP be cited for said violations.

Community members are NOT allowed to pile on; that is thread-crapping, and it too is a violation of the posting guidelines.

Keep your posts technical and on-topic, or don't post in this thread.

Moderator Idontcare

Edit 2: Thread Title has been edited to conform with forum guidelines.
 
Last edited by a moderator:

BD231

Lifer
Feb 26, 2001
10,568
138
106
So basically nVidia can't figure out how to run Fermi at full load safely. That's pretty sad.
 

Jodiuh

Senior member
Oct 25, 2005
287
1
81
So basically nVidia can't figure out how to run Fermi at full load safely. That's pretty sad.
C'mon man...at least read the article. Nothing happened until they increased the voltage from 0.987 V to 1.15 V.
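A rough back-of-the-envelope way to see why that matters: dynamic power scales roughly with C·V²·f, so that voltage bump alone is worth about 36% more power before any clock increase. This is just my own illustration, not something from the article, and the 822/950 MHz clocks below are only example numbers.

Code:
# Rough back-of-the-envelope sketch (not from the article): the classic
# CMOS dynamic-power approximation P ~ C * V^2 * f, used here only to
# illustrate why raising the voltage from 0.987 V to 1.15 V matters so much.

def relative_dynamic_power(v_old, v_new, f_old, f_new):
    """Ratio of dynamic power after a voltage/clock change (capacitance cancels out)."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# Voltage-only change from the overclocking test in the article:
print(relative_dynamic_power(0.987, 1.15, 1.0, 1.0))   # ~1.36x from voltage alone

# Hypothetical example: same voltage bump plus an 822 -> 950 MHz core clock bump
print(relative_dynamic_power(0.987, 1.15, 822, 950))   # ~1.57x dynamic power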
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
I don't think a lot of people realize how exploitable modern GPUs are. People see that an overclocked card throttles itself during an exploitative stress test and freak out because "nVidia can't figure out how to run Fermi at full load safely," when in fact these safety systems are what let GPU makers push more performance in real-world workloads.
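To be clear about what I mean, here is a purely conceptual sketch with made-up numbers, not NVIDIA's actual driver logic: the cap only clamps clocks when measured board power exceeds the limit, so a power virus gets throttled while a typical game never touches it.

Code:
# Conceptual sketch only -- not NVIDIA's actual driver logic, and the numbers
# are assumed for illustration. Clocks are cut only while measured board power
# exceeds the limit, so worst-case loads get clamped and normal games don't.

BOARD_POWER_LIMIT_W = 170        # assumed board power limit
BASE_CLOCK_MHZ = 822             # GTX 560 Ti reference core clock
THROTTLE_STEP_MHZ = 25

def next_clock(current_clock_mhz, measured_power_w):
    """Drop the clock while over the power limit, recover toward base otherwise."""
    if measured_power_w > BOARD_POWER_LIMIT_W:
        return max(current_clock_mhz - THROTTLE_STEP_MHZ, 405)   # floor at a low P-state
    return min(current_clock_mhz + THROTTLE_STEP_MHZ, BASE_CLOCK_MHZ)

# A game drawing 150 W never triggers the cap; a FurMark-style load at 210 W does.
clock = BASE_CLOCK_MHZ
for power in [150, 150, 210, 210, 210, 150, 150]:
    clock = next_clock(clock, power)
    print(power, "W ->", clock, "MHz")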
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
I don't think a lot of people realize how exploitable modern GPUs are. People see that an overclocked card throttles itself during an exploitative stress test and freak out because "nVidia can't figure out how to run Fermi at full load safely," when in fact these safety systems are what let GPU makers push more performance in real-world workloads.

The OP is blatantly stating there are power-saving features on this card, but they in fact only serve to benefit a stress test. According to you, somehow limiting my overclock potential is beneficial, even though this feature has zero effect on cards not running FurMark or some kind of overly intense stress test.

Realizing a program has the ability to burn your card up and doing something about it is not saving me any power where it counts, and in turn it serves to limit overclocks more than protect the end user.

Thanks Ben, but your logic is flawed on this one.
 

Jodiuh

Senior member
Oct 25, 2005
287
1
81
Oh...I just went with a catchy title, is all. ;)

I would like to see how this affects games though.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
The OP is blatantly stating there are power-saving features on this card, but they in fact only serve to benefit a stress test. According to you, somehow limiting my overclock potential is beneficial, even though this feature has zero effect on cards not running FurMark or some kind of overly intense stress test.

Realizing a program has the ability to burn your card up and doing something about it is not saving me any power where it counts, and in turn it serves to limit overclocks more than protect the end user.

Thanks Ben, but your logic is flawed on this one.

By your logic we should run houses without breaker panels.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91

BD231

Lifer
Feb 26, 2001
10,568
138
106
By your logic we should run houses without breaker panels.

Their desire to lock down modification will not be hidden behind some power-saving ploy; maybe that works on you, but not on me. Nvidia knows they have some serious headroom on these chips, and they're doing what they can to limit overclocks now.

Like it's really any surprise that only reference models have this feature while custom vendors are avoiding it?

Alright, Mr. Breaker Panels.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Their desire to lock down modification will not be hidden behind some power-saving ploy; maybe that works on you, but not on me. Nvidia knows they have some serious headroom on these chips, and they're doing what they can to limit overclocks now.

Like it's really any surprise that only reference models have this feature while custom vendors are avoiding it?

Alright, Mr. Breaker Panels.


Overclocking has never been a right. It is rather a privilege for the customer.

NV is selling you a GTX 560; the only expectation they have to fulfill is that it runs as a GTX 560 should. The same goes for AMD or any other company, so I'm not defending NV.

The power protection is there to protect them from unnecessary RMAs, not to put a burden on your greed for speed, silly.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
The OP is blatantly stating there are power-saving features on this card, but they in fact only serve to benefit a stress test. According to you, somehow limiting my overclock potential is beneficial, even though this feature has zero effect on cards not running FurMark or some kind of overly intense stress test.

Realizing a program has the ability to burn your card up and doing something about it is not saving me any power where it counts, and in turn it serves to limit overclocks more than protect the end user.

Thanks Ben, but your logic is flawed on this one.
First, the purpose of power consumption protection doesn't seem to be saving electricity, but rather preventing the card from drawing excessive power. I don't know what games you play or how often your video card ends up at 100% load. Usually only a stress program will do that, and that is when the protection kicks in. You've dismissed it as something that only works when the card is stressed, without realizing that most of the applications you use don't really stress the video card much.

Put it this way: I can clock my video card to the point that it will run WoW fine but won't survive Futuremark. Without thinking, I buy the latest and greatest game that is known to be a video card burner, run it, and in no time my video card dies. If power consumption protection prevents the card from instant death in that case, then it is a great idea.

The reason some manufacturers choose not to include this may be cost, knowing that people like you probably won't pay an extra 10 bucks for something they plan to disable from day one.

IMO this is a smart design, since the amount of power going into the card is proportional to the heat it generates. Because they know the thermal capacity of the heat sink, they know how much power the card can take without being damaged. With that protection in place, overclockers can play with whatever settings they like without worrying about killing the card (in theory, of course).
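To put rough numbers on that idea (the thermal resistance and temperature limit below are assumed values for illustration, not measured GTX 560 Ti specs):

Code:
# Rough steady-state estimate of the point above: power in equals heat out, so
# knowing the cooler's thermal resistance gives you the power budget. The 0.30 C/W
# and 97 C figures are assumed for illustration, not GTX 560 Ti specifications.

R_THETA_C_PER_W = 0.30     # assumed cooler thermal resistance, GPU to ambient
T_AMBIENT_C = 21.0         # ambient temperature from Jodiuh's post
T_MAX_C = 97.0             # rough thermal limit assumed for a Fermi-class GPU

def gpu_temp(power_w):
    """Steady-state GPU temperature for a given board power draw."""
    return T_AMBIENT_C + power_w * R_THETA_C_PER_W

def max_sustainable_power():
    """Largest power draw that keeps the GPU under its thermal limit."""
    return (T_MAX_C - T_AMBIENT_C) / R_THETA_C_PER_W

print(gpu_temp(170))              # ~72 C at a typical gaming load
print(gpu_temp(300))              # ~111 C -- over the limit, hence the power cap
print(max_sustainable_power())    # ~253 W budget under these assumptions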

Their desire to lock down modification will not be hidden behind some power-saving ploy; maybe that works on you, but not on me. Nvidia knows they have some serious headroom on these chips, and they're doing what they can to limit overclocks now.

Like it's really any surprise that only reference models have this feature while custom vendors are avoiding it?

Alright, Mr. Breaker Panels.
I believe that is your own opinion. You can clock an 8800 GTS as high as you want and a 560 will still single-handedly beat it. Yes, we can OC a 460 to match a stock 560, but it isn't easy, and that is without any protection.

You seem to think dynamic voltage adjustment has been in the computer world since the stone age. The only problem with dynamic voltage adjustment is monkey see, monkey do; the 570 thread is a good example. I'm not trying to say people who OC a 570 are dumb, but with so many overclocking tools (software) available for download, how can a manufacturer prevent 1d10t user errors? Experts like you, or any overclocker who actually knows a thing or two, would be able to remove whatever protection there is and proceed with their goal; nothing a soldering iron can't fix. Such protection does serve as a fail-safe feature for dummies and average Joes.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Title of review. I don't think this is a rumour/news site.

"GeForce GTX 560 Ti Power Consumption Protection Analysis"


"Reference GTX 560 Ti in overclocking state showed its real shape,its power consumption in two test software was obviously lower than another two graphic cards,and the maximum power consumption was only 350W,with gap of up to 130W."



I think all cards should have this protection enabled by default, with no option to disable it.
It would save a lot of RMAs, which saves us money.

Sorry OP, if I had the protection I would run some games for you, but my card didn't come with it.
 
Last edited:

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Title of review. I don't think this is a rumour/news site.

"GeForce GTX 560 Ti Power Consumption Protection Analysis"


"Reference GTX 560 Ti in overclocking state showed its real shape,its power consumption in two test software was obviously lower than another two graphic cards,and the maximum power consumption was only 350W,with gap of up to 130W."



I think all cards should have this protection enabled by default, with no option to disable it.
It would save a lot of RMAs, which saves us money.

Sorry OP, if I had the protection I would run some games for you, but my card didn't come with it.
ZOMG HAPPY FINALLY BOUGHT A GTX 460 :thumbsup:
 

Jodiuh

Senior member
Oct 25, 2005
287
1
81
Thanks!

you can disable it
On the 560? I haven't paid much attention, because I probably wouldn't want to do that.

Overclocking has never been a right.
Well, I picked up an SC from EVGA, and they flat-out lied to me about validating the OC at the applied voltage. The VID is 1.062 V, but 1.000 V is stable in anything I do.

Sorry OP, if I had the protection I would run some games for you, but my card didn't come with it.
AFAIK, EVGA's SC is a reference design, so it should be in place.

21 °C ambient, 81 °C load.
Auto fan: 50%
Volts: 1.062 V
Case: Stacker 810 w/ a bunch of 1200 RPM SFLEX
OC Scanner settings: 90% load engaged, power monger mode disabled

I've not noticed much difference in temps going from 90% load to 100% load or enabling the power delay. And I'm not positive yet, but even going down to 1.000 V doesn't seem to affect temps either...not sure what's going on.
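If anyone wants to reproduce this, here's a minimal logging sketch using the NVML Python bindings; it assumes pynvml is installed (e.g. the nvidia-ml-py package) and that the driver actually exposes these counters, which the power reading in particular may not on a card of this era.

Code:
# Minimal logging sketch, assuming the pynvml bindings are installed and the
# driver exposes these counters; power readings may not be supported here.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

for _ in range(10):                       # sample once a second for 10 s
    temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
    fan_pct = pynvml.nvmlDeviceGetFanSpeed(handle)
    try:
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0   # reported in mW
    except pynvml.NVMLError:
        power_w = float("nan")            # not supported on this board
    print(f"{temp_c} C  fan {fan_pct}%  {power_w:.1f} W")
    time.sleep(1)

pynvml.nvmlShutdown()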