I recently eBay'd a 1080 or two or three for an average of $428 US each, and was shocked at the ASIC quality reported by GPU-Z.

EDIT: Apparently GPU-Z can't accurately read the ASIC score from a 10-series card yet, so much of this is now void.

I'm accustomed to seeing ASIC quality in the 57-88% range, but my first Founders Edition was 97.6%! Then the second Founders Edition arrived, and it was 100%. Wow. The third card was not a Founders Edition, and it came in at 60.2%.

1080s: 97.6% (Dell/Alienware Founders Edition), 100% (EVGA Founders Edition), 60.2% (EVGA base model).
1070s: 60.2%, 60.2%, and 60.2%. Hmmm. All EVGA; one is "Superclocked," so I expected better.
R9 280Xs: 57.1% and 57.9%. Both MSI.

Note: The latest version of GPU-Z doesn't show ASIC quality (or I haven't yet found out how), but version 0.8.5 does. Open it, select the card (if you have multiple cards), then right-click on the title bar and choose "Read ASIC quality."

So, purely out of curiosity: what is your ASIC score, card model, and brand? Does it even matter? Supposedly the higher the score, the lower the power usage, and the closer the chip is to being a perfect sample.