GPU-Z 0.5.8 Cool Feature - ASIC Quality

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
TechPowerUp GPU-Z 0.5.8 Released

Saw this feature referenced here and there and finally researched it. Interesting information, and it seems reasonably accurate. The test rates my second, lower GTX 460 as inferior, which correlates with my overclocking testing of it. Also, the stock 3D core voltage is set higher on that card: 0.987 V vs. 1.037 V.

The next new feature is ASIC quality, designed for NVIDIA Fermi (GF10x and GF11x GPUs) and AMD Southern Islands (HD 7800 series and above), aimed at advanced users, hardware manufacturers, and the likes. We've found the ways in which AMD and NVIDIA segregate their freshly-made GPU ASICs based on the electrical leakages the chips produce (to increase yield by allotting them in different SKUs and performance bins), and we've found ways in which ASIC quality can be quantified and displayed. Find this feature in the context menu of GPU-Z. We're working on implementing this feature on older AMD Radeon GPUs.
Here is a screenshot of my two cards in sig

[attachment: asicquality.png]


I'm not sure what the highest possible result is, but if people want to post and compare, it should be fun/interesting.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
This is from a new review over at X-bit


The ASIC Quality screenshot on the right can be evoked from GPU-Z's context menu and is individual for each graphics card and GPU. This feature has been developed for Nvidia's Fermi (GF10x and GF11x) and AMD's Southern Islands chips (Radeon HD 78xx and HD 79xx) and is supposed to indicate the quality of the specific GPU, in percent, based on electrical leakage data. The GPU of our sample of the card has an ASIC quality of 76.6%. The higher this number, the lower the voltage the GPU needs at its default clock rate, and the higher the overclocking results you can get by increasing its voltage.
According to Alexey Nikolaichuk (the author of RivaTuner and MSI Afterburner), the correlation between voltage and quality is as follows:
ASIC quality < 75% - 1.1750 V;
ASIC quality < 80% - 1.1125 V;
ASIC quality < 85% - 1.0500 V;
ASIC quality < 90% - 1.0250 V;
ASIC quality ≤ 100% - 1.0250 V.
The GPU of our sample of the XFX card matches this correlation well enough.
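As a rough illustration, the quoted thresholds can be written as a simple lookup. This is only a sketch of the list above; `default_vid` is a hypothetical helper, not part of GPU-Z or any driver:

```python
def default_vid(asic_quality):
    """Map an ASIC quality reading (%) to the default GPU voltage (V)
    using the thresholds quoted above. Purely illustrative."""
    if asic_quality < 75.0:
        return 1.1750
    if asic_quality < 80.0:
        return 1.1125
    if asic_quality < 85.0:
        return 1.0500
    if asic_quality <= 100.0:
        # the quoted list gives 1.0250 V for both the <90% and <=100% bins
        return 1.0250
    raise ValueError("ASIC quality is a percentage, 0-100")

print(default_vid(76.6))  # X-bit's sample card falls in the <80% bin: 1.1125
```

Plugging in the 76.6% reading from the review lands in the under-80% bin, which is what X-bit means by the sample "matching the correlation".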
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Oh interesting. My card looks like it won't be here until Monday now *grumbles*, but I will be sure to check it then.
 

waffleironhead

Diamond Member
Aug 10, 2005
7,024
526
136
So they are just reading default voltage and comparing it to a range, then assigning a quality score to it based on the range?
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
At the link above for GPU-Z 0.5.8 there is a thread of questions; this was posted:

by btarunr (January 21st - 1:06 PM)
ASIC quality is derived from a hard-coded value from the GPU. Forget drivers, not even a BIOS change or soft/hard voltmods can change that measurement. On our end, we can only change the scale of measurement.

by btarunr (January 21st - 1:15 PM)
FAQ #1: ASIC quality:

Look at it this way, let's say there's an imperfect car manufacturing company, and not all the cars that come out of it have perfectly aligned and balanced wheels, and so to "increase yields", the car maker puts its cars (at the factory), through corrective studs that will align/balance out wheels.

Not all GPUs are born equal, not even in the same wafer. Some have higher electrical leakage, some have low. So to correct them in the fab, their degree of leakage is measured, corrective fuses are added to the GPU package, and the correct VID is set for the chip's leakage characteristics. Later, NVIDIA/AMD segregate "good" chips from "normal" ones based on leakage/VID: the "good" ones are put into higher bins and go to high-factory-OC cards, while the "normal" ones go to reference / low-factory-OC cards. Multiple SKUs based on the same physical GPU are also carved out this way: "normal" ones go to lower SKUs, "good" ones to higher SKUs.
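The fab-side flow btarunr describes ends with dies sorted into SKU bins by their quality grade. A minimal sketch of that last step, using quality figures posted later in this thread; the 80% cutoff and the bin labels are assumptions for illustration, not AMD's or NVIDIA's actual criteria:

```python
def bin_gpu(asic_quality, cutoff=80.0):
    """Sort a die into a SKU bin by its ASIC quality reading.
    The cutoff and labels are illustrative assumptions."""
    if asic_quality >= cutoff:
        return "good: high-factory-OC cards / higher SKUs"
    return "normal: reference / low-factory-OC cards / lower SKUs"

# Readings reported in this thread
cards = {"GTX 480": 50.4, "HD 7970": 82.1}
for name, quality in cards.items():
    print(name, "->", bin_gpu(quality))
```

Real binning would key off the measured leakage and the fused VID rather than GPU-Z's derived percentage, but the sorting logic is the same shape.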
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
I would strongly advise against calling that "quality". Leakage is both good and bad - leakage is wasted power, but it's also overclocking headroom. A card with no leakage would have no wasted power; however, it also wouldn't be able to overclock. Now what's quality to you, a card that has less wasted power, or a card with more overclocking headroom?
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
ViRGE said:
I would strongly advise against calling that "quality". Leakage is both good and bad - leakage is wasted power, but it's also overclocking headroom. A card with no leakage would have no wasted power; however, it also wouldn't be able to overclock. Now what's quality to you, a card that has less wasted power, or a card with more overclocking headroom?

I'm liking my wasted power :p
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,203
126
ViRGE said:
I would strongly advise against calling that "quality". Leakage is both good and bad - leakage is wasted power, but it's also overclocking headroom. A card with no leakage would have no wasted power; however, it also wouldn't be able to overclock. Now what's quality to you, a card that has less wasted power, or a card with more overclocking headroom?

This. Judging from temps on my CPUs, leaky transistors == fast transistors.

Anyway, all this is doing is reporting the VID fused into the ASIC, much like CoreTemp reports the VIDs for CPUs.

Calling it "ASIC quality" is quite the misnomer.
 

Bryf50

Golden Member
Nov 11, 2006
1,429
51
91
My GTX 480 is 50.4%. Not too sure what to make of it. It overclocks like a champ and is perfectly stable.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
My 7970 scored 82.1%... not sure what that'll mean to me. It's overclocking without any problems so far.
 

TC777

Member
May 12, 2005
62
0
0
I don't know whether what X-bit is saying is correct, though. On other forums I've seen people who have more than one card say that their card with a lower score overclocks better.
 

TakeNoPrisoners

Platinum Member
Jun 3, 2011
2,599
1
81
If they ever update the program to support 5xxx GPUs it will be interesting as my 5850 crashes with any clock adjustments. It won't take an overclock even if I overvolt it.
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
I don't see what this shows; the numbers change each time. Mine has ranged anywhere from 40 to 90 each time I try it.