blastingcap
Diamond Member
- Sep 16, 2010
Multimonitor power draw is 6W for the GTX 1060 vs. 40W for the RX 480. https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1060/24.html Idle and gaming-load figures are also available at that link (5W vs. 15W idle; 116W vs. 163W under load). Let's say the average difference is a mere 20W over 5 hours per day of combined idle/3D gaming time (ignoring multimonitor altogether for now). That's 20W x 5h = 0.1 kWh per day, or 0.1 kWh x 365 = 36.5 kWh per year. At 20 cents per kWh, that's $7.30/year. Some jurisdictions have higher or lower rates, and it depends on what tier you're on, but I'm paying about 20 cents/kWh in CA under my current tiered rate.
So if you keep the RX 480 for 2 years vs. the GTX 1060 for 2 years, the RX 480 costs you $14.60 more in electricity. Since OP is assuming equal initial pricing, this effectively makes the RX 480 $14.60 more expensive.
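If anyone wants to plug in their own rate or hours, here's a minimal sketch of that same math; the 20W gap, 5 hours/day, and 20 cents/kWh are just the assumptions above, not measured values:

```python
# Back-of-the-envelope electricity cost gap, RX 480 vs. GTX 1060
AVG_EXTRA_WATTS = 20    # assumed average extra draw across idle/3D time
HOURS_PER_DAY = 5       # assumed daily usage
RATE_PER_KWH = 0.20     # USD, roughly my CA tiered rate

extra_kwh_per_year = AVG_EXTRA_WATTS * HOURS_PER_DAY * 365 / 1000  # 36.5 kWh
cost_per_year = extra_kwh_per_year * RATE_PER_KWH                  # $7.30
print(f"{extra_kwh_per_year} kWh/yr -> ${cost_per_year:.2f}/yr, "
      f"${2 * cost_per_year:.2f} over 2 years")
```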
It's even worse if you're on multimonitor like I am, because idle power on the GTX 1060 only goes from 5W to 6W, while AMD goes from 15W to 40W... that's just... wtf. I mean, we KNOW AMD can fix this: the Fury X draws 21W and the Fury 11W in the same scenario. How did 11W or 21W turn into 40W? Somehow AMD went BACKWARDS in idle and multimonitor power draw with the RX 480 compared to Fury. This is why I said it's shameful: we know AMD can do better, but they just shrugged and gave us worse idle and multimonitor power efficiency than Fury anyway.
I already addressed FreeSync above by saying that if it matters to you, it can be a dealmaker. The problem for most people is that they don't have FreeSync monitors. Many of us won't buy crappy $140 FreeSync monitors no matter what; we already have high-end IPS monitors that we don't want to, or can't, replace until necessary. The target market for the 1060/480 is more cost-conscious gamers, after all, not the guys who are fine paying $$$ for TITANs and can presumably afford new monitors as well. Also, some people, like my wife, game on the TV.
The GTX 1060 has more OC headroom than the RX 480. I encourage skeptics to look at a wide range of reviews on this, including Tom's, which has curves for their sample.
