
GPU-Z 0.5.8 Cool Feature - ASIC Quality

notty22

Diamond Member
TechPowerUp GPU-Z 0.5.8 Released

Saw this feature referenced here and there and finally researched it. Interesting information, and it seems to be at least somewhat accurate. The test rates my second, lower GTX 460 as inferior, and that correlates with my o/c testing of it. The stock 3D core voltage is also set higher on that card: 0.987 vs. 1.037.

The next new feature is ASIC quality, designed for NVIDIA Fermi (GF10x and GF11x GPUs) and AMD Southern Islands (HD 7800 series and above), aimed at advanced users, hardware manufacturers, and the likes. We've found the ways in which AMD and NVIDIA segregate their freshly-made GPU ASICs based on the electrical leakages the chips produce (to increase yield by allotting them in different SKUs and performance bins), and we've found ways in which ASIC quality can be quantified and displayed. Find this feature in the context menu of GPU-Z. We're working on implementing this feature on older AMD Radeon GPUs.
Here is a screenshot of the two cards in my sig:

asicquality.png


I'm not sure what the highest possible result might be, but if people want to post and compare, it should be fun/interesting.
 
This is from a new review over at X-bit


The ASIC Quality screenshot on the right can be invoked from GPU-Z's context menu and is individual for each graphics card and GPU. This feature has been developed for Nvidia’s Fermi (GF10x and GF11x) and AMD’s Southern Islands chips (Radeon HD 78xx and HD 79xx) and is supposed to indicate the quality of the specific GPU, in percent, based on electrical leakage data. The GPU of our sample of the card has an ASIC quality of 76.6%. The higher this number, the lower the voltage the GPU needs to work at the default clock rate, and the higher the overclocking results you can get with it by increasing its voltage.
According to Alexey Nikolaichuk (the author of RivaTuner and MSI Afterburner), the correlation between voltage and quality is as follows:
ASIC quality < 75% - 1.1750 V;
ASIC quality < 80% - 1.1125 V;
ASIC quality < 85% - 1.0500 V;
ASIC quality < 90% - 1.0250 V;
ASIC quality ≤ 100% - 1.0250 V.
The GPU of our sample of the XFX card matches this correlation well enough.
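The quality-to-voltage table above can be sketched as a simple lookup. This is purely a hypothetical illustration: the function name and structure are mine, and only the thresholds and voltages come from the quote.

```python
def default_vid(asic_quality: float) -> float:
    """Return the expected default GPU voltage (in volts) for an
    ASIC quality percentage, per the correlation quoted above."""
    # (upper bound, voltage) pairs from the quoted table
    table = [
        (75.0, 1.1750),
        (80.0, 1.1125),
        (85.0, 1.0500),
        (90.0, 1.0250),
    ]
    for upper, vid in table:
        if asic_quality < upper:
            return vid
    return 1.0250  # 90% <= quality <= 100%, as quoted

# The X-bit sample card (76.6% quality) falls in the "< 80%" bin:
print(default_vid(76.6))  # 1.1125
```

Under this reading, a card like the 50.4% GTX 480 mentioned later in the thread would land in the lowest bin, 1.1750 V.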
 
Oh interesting. My card looks like it won't be here until Monday now *grumbles* But I will be sure to check it then.
 
So they are just reading default voltage and comparing it to a range, then assigning a quality score to it based on the range?
 
At the link above for GPU-Z 0.5.8 there is a thread of questions; this was posted:

by btarunr (January 21st - 1:06 PM) - Reply
ASIC quality is derived from a hard-coded value from the GPU. Forget drivers, not even a BIOS change or soft/hard voltmods can change that measurement. On our end, we can only change the scale of measurement.
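If, as described above, the quality figure is derived from a hard-coded fused value and GPU-Z only controls the scale of measurement, the rescaling step might look something like the sketch below. The raw value's range and meaning here are assumptions for illustration, not GPU-Z's actual code.

```python
def scale_to_percent(raw: int, raw_min: int = 0, raw_max: int = 255) -> float:
    """Map a hard-coded raw fuse reading onto a 0-100% scale.
    The 8-bit raw range is an assumption for illustration."""
    raw = max(raw_min, min(raw, raw_max))  # clamp to the assumed range
    return 100.0 * (raw - raw_min) / (raw_max - raw_min)

print(round(scale_to_percent(193), 1))  # e.g. a raw reading of 193 -> 75.7
```

Changing `raw_min`/`raw_max` changes every card's reported percentage without touching the fused value itself, which would fit the "we can only change the scale of measurement" remark.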

by btarunr (January 21st - 1:15 PM) - Reply
FAQ #1: ASIC quality:

Look at it this way, let's say there's an imperfect car manufacturing company, and not all the cars that come out of it have perfectly aligned and balanced wheels, and so to "increase yields", the car maker puts its cars (at the factory), through corrective studs that will align/balance out wheels.

Not all GPUs are born equal, not even on the same wafer. Some have higher electrical leakage, some have lower. So to correct them in the fab, their degree of leakage is measured, corrective fuses are added to the GPU package, and the correct VID is set for the chip's leakage characteristics. Later, NVIDIA/AMD segregate "good" chips from "normal" based on leakage/VID: the "normal" ones go to reference / low-factory-OC cards, the "good" ones go to high-factory-OC cards. Multiple SKUs based on the same physical GPU are also carved out this way: "normal" ones go to lower SKUs, "good" ones go to higher SKUs.
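The segregation described in the post above could be sketched roughly as a binning step on the quality figure. The cutoff numbers here are invented for illustration; the real fab criteria are not public.

```python
def bin_gpu(asic_quality: float) -> str:
    """Assign a die to a marketing bin from its quality figure,
    following the "good" vs "normal" split described above.
    The 80% and 50% cutoffs are made-up illustrative numbers."""
    if asic_quality >= 80.0:
        return "good -> higher SKU / high-factory-OC card"
    elif asic_quality >= 50.0:
        return "normal -> reference / low-factory-OC card"
    else:
        return "below cutoff -> lower SKU or salvage part"

print(bin_gpu(76.6))  # normal -> reference / low-factory-OC card
```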
 
I would strongly advise against calling that "quality". Leakage is both good and bad: leakage is wasted power, but it's also overclocking headroom. A card with no leakage would have no wasted power; however, it also wouldn't be able to overclock. Now what's quality to you, a card that has less wasted power, or a card with more overclocking headroom?
 
I would strongly advise against calling that "quality". Leakage is both good and bad: leakage is wasted power, but it's also overclocking headroom. A card with no leakage would have no wasted power; however, it also wouldn't be able to overclock. Now what's quality to you, a card that has less wasted power, or a card with more overclocking headroom?

I'm liking my wasted power 😛
 
I would strongly advise against calling that "quality". Leakage is both good and bad: leakage is wasted power, but it's also overclocking headroom. A card with no leakage would have no wasted power; however, it also wouldn't be able to overclock. Now what's quality to you, a card that has less wasted power, or a card with more overclocking headroom?

This. Judging from temps on my CPUs, leaky transistors == fast transistors.

Anyways, all this is doing is reporting the VID fused in the ASICs. Much like CoreTemp reports the VIDs for CPUs.

Calling it "ASIC quality", is quite the misnomer.
 
My GTX 480 is 50.4%. Not too sure what to make of it. It overclocks like a champ and is perfectly stable.
 
I don't know if what X-Bit is saying is correct or not though. On other forums I've seen people say that their card with a lower score overclocks better. (from people who have more than one card)
 
If they ever update the program to support 5xxx GPUs, it will be interesting, as my 5850 crashes with any clock adjustments. It won't take an overclock even if I overvolt it.
 