Opinions on Nvidia's new GPU binning system?


bryanW1995

Lifer
May 22, 2007
While you are correct that there are low-powered, specially binned processors just for that task, there are also processors from the same batch that use different VIDs. Some i7 920s use 1.225 V stock, some use 1.2 V, some use 1.25 V.

cool, mine is 1.2 v! :)

btw, I think that this is a good business move by nvidia, especially in light of the crappy tsmc yields this gen.
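
For illustration, here's a minimal sketch of how per-die VID binning could work (hypothetical voltage steps and thresholds, not Nvidia's or Intel's actual process):

[CODE]
# Minimal sketch of voltage binning: each die ships at the lowest VID step
# that covers the voltage it needs to be stable at the target clock.
# All numbers are hypothetical, not Nvidia's or Intel's actual values.
VID_STEPS = [1.200, 1.225, 1.250]  # selectable voltages for this SKU

def bin_die(min_stable_voltage: float) -> str:
    """Return the VID assignment for a die, or demote it to a lower SKU."""
    for vid in VID_STEPS:
        if vid >= min_stable_voltage:
            return f"ships at VID {vid:.3f} V"
    return "needs too much voltage -> salvaged as a lower SKU"

# Three dies from the same wafer, with different silicon quality:
for need in (1.19, 1.23, 1.27):
    print(f"die needs {need:.2f} V: {bin_die(need)}")
[/CODE]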
 

Zap

Elite Member
Oct 13, 1999
VID != overclockability. I cannot state this enough, actually here:
VID != overclockability
VID != overclockability
VID != overclockability
Some of the best overclocking chips from Intel and AMD can barely qualify for the lowest speed bin without using too much voltage; however, when their TDP is ignored, these chips shoot to the moon with better cooling.

Right. Doesn't anyone remember those "special" AMD chips that were given out last year? AMD couldn't sell those because they were so leaky, but supposedly under extreme cooling they were simply the highest overclockers.

To be honest I couldn't tell you if any other processors do that. I just know Nehalem does.

Core 2 CPUs are like that too. Don't know of any others off-hand, but basically Intel has been doing this for years, and nobody makes a fuss about it.

Simply another mountain out of a molehill.

But we're talking about NVIDIA here.

"A WITCH! BURN HER!"
 

Nox51

Senior member
Jul 4, 2009
99.9% of end users probably don't know or care what voltage their chip is. Nor should they.

Simply another mountain out of a molehill.

I don't think you have any credibility making that claim, to be quite honest.
 

taltamir

Lifer
Mar 21, 2004
VID != overclockability.
While this is true, VID does serve as a relatively reliable indicator of overclockability... it's not a given; you CAN have lower-VID chips that clock worse and higher-VID ones that clock better, but just because there are exceptions doesn't mean the rule of thumb is useless.
 

Voo

Golden Member
Feb 27, 2009
it's not a given; you CAN have lower-VID chips that clock worse and higher-VID ones that clock better, but just because there are exceptions doesn't mean the rule of thumb is useless.
No, actually high-leakage chips usually OC exceptionally well as long as you can cool them, so that rule of thumb is completely wrong... I'm sure Aigo or someone can explain it in more detail.
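
A toy model of that effect, using the textbook CMOS power split (dynamic power ~ C*V^2*f plus temperature-dependent leakage) with entirely made-up constants:

[CODE]
# Toy model: why a leaky die can bust its TDP bin at stock settings yet
# overclock well once cooled. Dynamic power scales ~ C*V^2*f; leakage
# scales with voltage and rises steeply with temperature.
# All constants are invented for illustration only.

def total_power(freq_ghz, volts, leak, temp_c):
    dynamic = 20 * volts ** 2 * freq_ghz                  # ~ C*V^2*f
    leakage = leak * volts * (1 + 0.02 * (temp_c - 25))   # grows with heat
    return dynamic + leakage

for name, leak in (("low-leakage die", 10), ("high-leakage die", 40)):
    stock = total_power(3.0, 1.20, leak, temp_c=70)    # air cooling, stock
    chilled = total_power(4.0, 1.35, leak, temp_c=20)  # extreme cooling, OC
    print(f"{name}: {stock:.0f} W at stock/70C, {chilled:.0f} W at 4 GHz/20C")
[/CODE]

The leaky die blows its power budget at stock temperatures, but chilling it collapses the leakage term and frees up headroom for extra clocks and voltage.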
 

BFG10K

Lifer
Aug 14, 2000
nVidia's new binning system is responsible for the monstrosity that is the GTX465. I can’t ever remember a mid-range card being hotter and louder than a high-end card from the same generation.

That makes the GTX465 one of the worst graphics cards ever to be released.
 

happy medium

Lifer
Jun 8, 2003
nVidia's new binning system is responsible for the monstrosity that is the GTX465. I can’t ever remember a mid-range card being hotter and louder than a high-end card from the same generation.

That makes the GTX465 one of the worst graphics cards ever to be released.

I think it depends on what brand/model gtx 465 you're looking at.
Palit and Galaxy have cooler and quieter gtx 465's too.

I can link you to a few reviews that say just the opposite.

MSI Twin Frozr II:
http://www.guru3d.com/article/msi-n465gtx-twin-frozr-ii-review/5

Evga...
[chart from EVGA GTX 465 review]


"We went into this review thinking the GTX 465 would exhibit the same acoustical profile as the GTX 470 but we were wrong. Instead of being moderately noticeable over the system fans, we didn’t hear a peep out of this new card even though it exhibited extremely good temperatures throughout testing"


It just costs too damn much for the performance it gives. :(
 

BFG10K

Lifer
Aug 14, 2000
I think it depends on what brand gtx 465 you're looking at.
No, it depends on the card, and that's the problem. It's possible for the GTX465 to have a higher TDP than the GTX470 while offering less performance. Again, seeing something like that in the same generation is unheard of.

Sure, you can stick a non-reference cooler on there, but it doesn't change the TDP vs performance equation above.

Any GTX465 that uses more power than a GTX470/GTX480 is a faulty card; there’s simply no other way to describe it. But with nVidia’s binning, there’s no way to tell which one you’ll get.
 

happy medium

Lifer
Jun 8, 2003
No, it depends on the card, and that's the problem. It's possible for the GTX465 to have a higher TDP than the GTX470 while offering less performance. Again, seeing something like that in the same generation is unheard of.

Sure, you can stick a non-reference cooler on there, but it doesn't change the TDP vs performance equation above.

Any GTX465 that uses more power than a GTX470/GTX480 is a faulty card; there’s simply no other way to describe it. But with nVidia’s binning, there’s no way to tell which one you’ll get.

I thought they didn't make reference gtx 465's?
They are all non-reference, right?
So the partners made the crappy coolers at first and are now making better ones, it seems?
You tell me.

Edit: This comes from the Anandtech review...
"Unlike the GTX 480/470 launch, NVIDIA is not seeding the press with reference cards. Instead that task has been left up to the vendors, who are selling identical cards that we believe all come from NVIDIA"

So the vendors kept the crappy gtx 470 cooler design?
 

BFG10K

Lifer
Aug 14, 2000
I thought they didn't make reference gtx 465's?
They are all non-reference, right?
So the partners made the crappy coolers at first and are now making better ones, it seems?
You tell me.
By reference I mean the blower, à la GTX470/GTX480. But you're missing the point - it's about the TDP, not the cooler.

There’s no rational explanation for the GTX465 to use more power than the GTX470 (given it’s the same generation), unless it’s a faulty part being overvolted to work. That’s what nVidia’s new binning system is producing.

A non-reference cooler won’t change that fact. Sure, it might run cooler and quieter, but it doesn’t change the fact that it’s using more power than a part that offers more performance from the same generation.
 

happy medium

Lifer
Jun 8, 2003
By reference I mean the blower, à la GTX470/GTX480. But you're missing the point - it's about the TDP, not the cooler.

There’s no rational explanation for the GTX465 to use more power than the GTX470 (given it’s the same generation), unless it’s a faulty part being overvolted to work. That’s what nVidia’s new binning system is producing.

A non-reference cooler won’t change that fact. Sure, it might run cooler and quieter, but it doesn’t change the fact that it’s using more power than a part that offers more performance from the same generation.

So you are saying that some/most/all gtx 465's use more power than the gtx 470?

[power consumption charts from several GTX 465 reviews]


From Guru3D:

Advertised GeForce GTX 465 TDP = 200 W
System in IDLE = 178 W
System wattage with GPU in FULL stress = 339 W
Difference (GPU load) = 161 W
Add average IDLE wattage = ~25 W
Subjective obtained GPU power consumption = ~186 W
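
Spelled out, that subtraction method is just the following (a sketch using the numbers quoted above):

[CODE]
# Guru3D's method: measure whole-system wall power at idle and under GPU
# load, take the difference, then add back the card's assumed idle draw.
system_idle_w = 178   # whole system, idle
system_load_w = 339   # whole system, GPU fully stressed
card_idle_w = 25      # assumed average idle draw of the card itself

load_delta_w = system_load_w - system_idle_w  # 161 W extra under load
card_power_w = load_delta_w + card_idle_w     # ~186 W for the card
print(f"estimated GTX 465 draw: ~{card_power_w} W vs the 200 W advertised TDP")
[/CODE]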


So you are saying it's luck of the draw? You might get one that uses more power than a gtx 470?

Edit: disclaimer... I'm in no way defending the crappiness of the gtx 465. :)
 

bryanW1995

Lifer
May 22, 2007
nVidia's new binning system is responsible for the monstrosity that is the GTX465. I can’t ever remember a mid-range card being hotter and louder than a high-end card from the same generation.

That makes the GTX465 one of the worst graphics cards ever to be released.

It definitely has some tough competition:

http://www.anandtech.com/show/1062
http://www.anandtech.com/show/2231

ok, call me a loser, but I started reading the 2900xt article again and noticed that the daily tech links are current. Oh, the wonders of modern technology!
 

happy medium

Lifer
Jun 8, 2003
OK, so we can put the gtx 465, fx5800, 2900xt, and 5830 in the crap drawer. :)
Imagine how much power it would take to run that drawer and how loud it would be? :)
 

BFG10K

Lifer
Aug 14, 2000
5800 Ultra, 2900XT, (etc) are not the same thing as the GTX465. To have something comparable to the GTX465, you’d have to have the 5600 using more power than the 5800U, or the 2600XT using more power than the 2900XT.

That’s what I’m talking about here – a mid-range part using more power than a higher performing part of the same generation.
 

happy medium

Lifer
Jun 8, 2003
5800 Ultra, 2900XT, (etc) are not the same thing as the GTX465. To have something comparable to the GTX465, you’d have to have the 5600 using more power than the 5800U, or the 2600XT using more power than the 2900XT.

That’s what I’m talking about here – a mid-range part using more power than a higher performing part of the same generation.

Correction......

A mid-range part that could sometimes use more power than a higher performing part.

Better?

And to be honest, before we pass judgement we need more data/results.
Just because 2 websites show bad results doesn't necessarily mean it's true.
They might have gotten crappy initial cards?
Might be a good idea to check more recent reviews? The newer reviews don't seem to support your argument.
 

Wreckage

Banned
Jul 1, 2005
OK, so we can put the gtx 465, fx5800, 2900xt, and 5830 in the crap drawer. :)
Imagine how much power it would take to run that drawer and how loud it would be? :)
I would add the x1800.

The 5830 is the worse monster this round, as its performance sucked so badly and drew universal disappointment from reviewers. The 465 at least performed well.

True, the 460 is now a better card overall, but that's what a refresh can do.
 

Ben90

Platinum Member
Jun 14, 2009
That’s what I’m talking about here – a mid-range part using more power than a higher performing part of the same generation.
There has always been variation within microprocessors; e.g. on average I believe a 920 uses more power than a 975 despite running at a slower clock speed. I haven't been paying enough attention over the years to know if this applies to older video cards, but both the 465 and 5830 can use more power than their more powerful brothers.

If this huge variation in power characteristics is unique to this generation of cards, I believe we can safely say it is most likely a TSMC issue or bad designs by both Nvidia and ATI. If it is the norm, we can just chalk it up to standard manufacturing variation. Like I said, I never paid much attention to power characteristics a few years ago, so I don't know if this is normal or not.


*edit for pictures*

[power consumption charts]
 

Voo

Golden Member
Feb 27, 2009
nVidia's new binning scheme is responsible for that.
What exactly makes you sure Nvidia wouldn't sell all chips at the highest possible voltage they think they can sell? So far I haven't seen any argument for that...
 

bryanW1995

Lifer
May 22, 2007
ok, I see what you mean now. so an individual gtx 465 COULD be the worst card of all time, or it could simply be a cut-down gtx 470. wow, I wasn't trying to insult the 5800 U and 2900 xt like that.

@ happymedium, nice trick including the 5830 in that "suckage" category. It is clearly a bad card, but it's bad more from a price/performance standpoint than truly historically terrible. If amd had priced it at $175 out of the gate it wouldn't have the same reputation. The 2900xt and 5800 ultra were supposed to be flagships and they made amd/nvidia look very bad, while the gtx 465 just plain sucks any way you slice it.

btw, I think that going forward we will see both camps use variable voltage. When the top-end parts use it too, it will be much harder to point the finger at the salvaged units later on and say that their power consumption is unreasonable.
 

BFG10K

Lifer
Aug 14, 2000
Just because 2 websites show bad results doesn't necessarily mean it's true.
Sure it does. The existence of something proves it exists. That’s rather elementary logic.

They might have gotten crappy initial cards?
Right, just like anyone else could get one. That’s the point.

wow that picture shows a 5830 drawing more power than a 5850?
BFG?
You can partially explain that away by the fact that the 5830 is clocked at 800 MHz vs the 5850’s core clock of 725 MHz. But yes, the 5830 is an inefficient part given that it offers less performance than the 5850 while sometimes burning more power.
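
Quick arithmetic on that clock gap, for what it's worth (dynamic power scales roughly linearly with frequency at a fixed voltage):

[CODE]
# The 5830's higher core clock partially offsets its disabled hardware.
hd5830_mhz = 800
hd5850_mhz = 725
print(f"5830 core clock is {hd5830_mhz / hd5850_mhz - 1:.0%} higher than the 5850's")
# -> ~10% higher clock, which helps explain the occasional higher power draw
[/CODE]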

So what’s the GTX465’s excuse? Everything about it is cut down relative to the GTX470 (except the core and shader clocks, which are the same).

There’s no justifiable reason it should use more power than a GTX470 unless a defective part is being overvolted.
 

happy medium

Lifer
Jun 8, 2003
@ happymedium, nice trick including the 5830 in that "suckage" category. It is clearly a bad card, but it's bad more from a price/performance standpoint than truly historically terrible. If amd had priced it at $175 out of the gate it wouldn't have the same reputation. The 2900xt and 5800 ultra were supposed to be flagships and they made amd/nvidia look very bad, while the gtx 465 just plain sucks any way you slice it

I followed the 5830 very closely because I wanted to upgrade my 5750 with it. :(
It totally pissed me off with its suckage, so yes, I'm a little biased against that particular card. :)