Asus ROG Strix RX480 OC vs MSI Gaming GTX 1060 OC [HardOCP]


Hail The Brain Slug

Diamond Member
Oct 10, 2005
Their complaint that the RX480 used 100 more watts than the 1060 seems idiotic considering their OC strategy was literally to just set the RX480 volts to maximum.

My STRIX 480 runs 1400 MHz stable at 1.15 volts, and my power draw reported in GPU-Z is 60-80 watts less than theirs.
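As rough context: CMOS dynamic power scales roughly with frequency times voltage squared, so maxing out the voltage slider inflates board power far more than the clock gain alone would suggest. A minimal sketch with illustrative numbers (the 120 W baseline and the 1.30 V figure are assumptions, not measured values):

```python
# Rough dynamic-power scaling estimate: P ~ f * V^2 (ignores leakage and
# memory/VRM losses, so treat the result as a ballpark only).
def scaled_power(p_base_w, f_base_mhz, v_base, f_new_mhz, v_new):
    """Estimate new GPU power from a baseline operating point using P ~ f * V^2."""
    return p_base_w * (f_new_mhz / f_base_mhz) * (v_new / v_base) ** 2

# Hypothetical baseline: ~120 W GPU-only draw at 1400 MHz / 1.15 V.
base = 120.0
print(scaled_power(base, 1400, 1.15, 1400, 1.30))  # same clock, maxed volts -> ~153 W
print(scaled_power(base, 1400, 1.15, 1500, 1.30))  # 1500 MHz at 1.30 V      -> ~164 W
```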
 

Hail The Brain Slug

Diamond Member
Oct 10, 2005
Either they have a crap card or I have a "golden" card. I just chucked in 1500 MHz at 1.3 volts and 9000 MHz on the memory and got a stable, artifact free run of Time Spy. Look at that power usage! What a hog.

[Screenshot: Eatmyshorts_h_.jpg]


I would try more than 1500 MHz, but it seems to bounce around 1460-1500 due to hitting the (150%) power limit.

Here's a run with my normal, 100% stable OC for contrast.

[Screenshot: NormalOCrun.jpg]
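For scale: the RX 480 uses a 256-bit GDDR5 bus, so the memory overclock above maps directly to extra peak bandwidth. A quick sanity check comparing the stock 8 Gbps effective rate with the 9000 MHz (9 Gbps effective) setting:

```python
# Peak GDDR5 bandwidth = (bus width in bytes) * (effective data rate per pin).
def gddr5_bandwidth_gbs(bus_width_bits, effective_gbps):
    return bus_width_bits / 8 * effective_gbps

print(gddr5_bandwidth_gbs(256, 8.0))  # stock RX 480 8GB (8 Gbps): 256 GB/s
print(gddr5_bandwidth_gbs(256, 9.0))  # 9000 MHz effective OC:     288 GB/s (~12.5% more)
```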
 

tajoh111

Senior member
Mar 28, 2005
Hail The Brain Slug said:
Either they have a crap card or I have a "golden" card. I just chucked in 1500 MHz at 1.3 volts and 9000 MHz on the memory and got a stable, artifact free run of Time Spy. Look at that power usage! What a hog.

[Screenshot: Eatmyshorts_h_.jpg]

I would try more than 1500 MHz, but it seems to bounce around 1460-1500 due to hitting the (150%) power limit.

Here's a run with my normal, 100% stable OC for contrast.

[Screenshot: NormalOCrun.jpg]

You do have a golden card. The number of 1500 MHz RX 480s running on air out there can be counted on one hand.

And even then, bench stable is not game stable.
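One cheap way to get closer to game-stable than a single clean benchmark pass is to loop a stress workload for hours and log abnormal exits. A minimal sketch; the command line below is a placeholder for whatever benchmark or stress tool is actually used:

```python
import subprocess
import time

# Placeholder command: substitute your own looping benchmark / stress workload.
CMD = ["your_stress_tool.exe", "--fullscreen", "--loop"]
DURATION_S = 2 * 60 * 60  # run for ~2 hours rather than one short pass

start = time.time()
runs = failures = 0
while time.time() - start < DURATION_S:
    result = subprocess.run(CMD)   # blocks until the workload exits
    runs += 1
    if result.returncode != 0:     # a crash or driver reset usually shows up as a non-zero exit
        failures += 1
print(f"{runs} runs, {failures} abnormal exits")
```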
 

Hail The Brain Slug

Diamond Member
Oct 10, 2005
tajoh111 said:
You do have a golden card. The number of 1500 MHz RX 480s running on air out there can be counted on one hand.

And even then, bench stable is not game stable.

Yes, I know that. Hence including my current 100% stable overclock for contrast.

I feel like I should explore 1.3V and see if there is a possible stable overclock there, however.
 

Face2Face

Diamond Member
Jun 6, 2001
Hail The Brain Slug said:
Either they have a crap card or I have a "golden" card. I just chucked in 1500 MHz at 1.3 volts and 9000 MHz on the memory and got a stable, artifact free run of Time Spy. Look at that power usage! What a hog.

I would try more than 1500 MHz, but it seems to bounce around 1460-1500 due to hitting the (150%) power limit.

Here's a run with my normal, 100% stable OC for contrast.

Wow, nice card. You have a choice RX 480 there. ASUS has done a fantastic job with the STRIX coolers this gen.
 

Hail The Brain Slug

Diamond Member
Oct 10, 2005
Face2Face said:
Wow, nice card. You have a choice RX 480 there. ASUS has done a fantastic job with the STRIX coolers this gen.

I did repaste the card and saw a 5-8°C improvement in temps, though. The cooler mounting surface is very rough, and the GPU die is permanently marked after I replaced the paste with a thin layer of GC Extreme. It might be worth polishing the surface a little to make it smoother, but it's a direct-heatpipe cooler, so it's hard to tell how much material is left (since they've already ground down the surface of the heatpipes).

During the run I wasn't worried about the card, though; I was more worried about the motherboard going *POP* with how high the peak power draw was. I wonder if there's an easy way to check whether, in cases of extreme draw like this, the power pulled through the motherboard slot is still limited and all the extra power comes directly from the PSU's PCIe connector, or not.
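A back-of-the-envelope way to reason about that question: the PCIe x16 slot is specced for 75 W and an 8-pin PEG connector for 150 W, so if the slot stays within spec, everything beyond 75 W has to come through the 8-pin. A sketch with a made-up peak board power figure (not a measured value):

```python
# PCIe power budget: 75 W from the x16 slot, 150 W from an 8-pin PEG connector.
SLOT_LIMIT_W = 75.0
EIGHT_PIN_LIMIT_W = 150.0

def min_connector_draw(total_board_power_w):
    """If the slot is held to its 75 W spec, the 8-pin must supply at least this much."""
    return max(0.0, total_board_power_w - SLOT_LIMIT_W)

total = 230.0  # hypothetical peak board power during the 1.3 V run
needed = min_connector_draw(total)
print(f"8-pin must supply >= {needed:.0f} W "
      f"({'within' if needed <= EIGHT_PIN_LIMIT_W else 'over'} its 150 W spec)")
```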
 

Carfax83

Diamond Member
Nov 1, 2010
Quoting an earlier post:
maybe they are waiting for one that shows improvement for both amd and nvidia. good luck with that.

And what's wrong with this? I don't think people fully understand that the performance potential of DX12 swings both ways. On one hand, developers have more access to the hardware than ever before, which means they can unlock greater performance. On the other, the potential for catastrophic screw-ups is also greatly increased.

Take Total War Warhammer for instance. This game has a large increase for AMD in DX12, but for NVidia it results in a large decrease.

However, NVidia has higher performance under DX11 than AMD does under DX12. How is this possible?

Because most developers don't have a complete grasp of how to fully utilize DX12 with both hardware vendors. And to top it off, NVidia's DX11 driver is so efficient that it already mimics some of the functions in DX12, making it harder for developers to increase performance beyond DX11 for NVidia.

With AMD on the other hand, their DX11 driver is nowhere near as efficient as NVidia's, although to be fair, they've made great strides recently.

[Chart: Warhammer_01.png]
 

Carfax83

Diamond Member
Nov 1, 2010
Quoting an earlier post:
Most people don't overclock highly, or even at all.

This is true, but the target audience buying elite-class video cards with extensive cooling and power-draw tuning is not exactly "most people." The people that buy this kind of hardware are more likely to be hardware enthusiasts, and not just gamers.
 

Sven_eng

Member
Nov 1, 2016
Carfax83 said:
And what's wrong with this? I don't think people fully understand that the performance potential of DX12 swings both ways. On one hand, developers have more access to the hardware than ever before, which means they can unlock greater performance. On the other, the potential for catastrophic screw-ups is also greatly increased.

Take Total War Warhammer for instance. This game has a large increase for AMD in DX12, but for NVidia it results in a large decrease.

However, NVidia has higher performance under DX11 than AMD does under DX12. How is this possible?

Because most developers don't have a complete grasp of how to fully utilize DX12 with both hardware vendors. And to top it off, NVidia's DX11 driver is so efficient that it already mimics some of the functions in DX12, making it harder for developers to increase performance beyond DX11 for NVidia.

With AMD on the other hand, their DX11 driver is nowhere near as efficient as NVidia's, although to be fair, they've made great strides recently.

Nvidia's DX11 optimizations are really good, which is why they don't see any benefits in DX12. With that being the case, maybe Nvidia has no incentive to improve its DX12 drivers while DX12 is also seen to benefit AMD? There are games where even AMD cards don't benefit from DX12, but this usually changes after some patches and drivers. I have tested most DX12 games, and most are now equal or better in DX12 for AMD. Even Tomb Raider is much better in DX12 for AMD, but in all cases Nvidia's DX11 is better. They are not trying because it does not suit their agenda.

There still has been no true DX12 or Vulkan game though.
 

Bacon1

Diamond Member
Feb 14, 2016
Carfax83 said:
This is true, but the target audience buying elite-class video cards with extensive cooling and power-draw tuning is not exactly "most people." The people that buy this kind of hardware are more likely to be hardware enthusiasts, and not just gamers.

Since when is the $150-$300 segment hardware enthusiasts? Heck people don't even OC their 980 Ti / Titans / 1070s. If you have any sources that say most people buying these cards OC them, please feel free to post it.
 

Carfax83

Diamond Member
Nov 1, 2010
Hail The Brain Slug said:
Their complaint that the RX480 used 100 more watts than the 1060 seems idiotic considering their OC strategy was literally to just set the RX480 volts to maximum.

My STRIX 480 runs 1400 MHz stable at 1.15 volts, and my power draw reported in GPU-Z is 60-80 watts less than theirs.

I can see that you didn't read the entire article. They also tested the overclock at stock voltage and got 1360 MHz. Obviously you have a golden sample, as I haven't seen any reviewer hit the speeds that you have, with or without additional voltage.

And as it's been mentioned, benchmarks and actual games are two different species.
 

Carfax83

Diamond Member
Nov 1, 2010
Sven_eng said:
Even Tomb Raider is much better in DX12 for AMD, but in all cases Nvidia's DX11 is better.

Not true. Ashes of the Singularity runs faster in DX12 on NVidia. The Division also runs faster in DX12 assuming there is a CPU bottleneck.

Sven_eng said:
There still has been no true DX12 or Vulkan game though.

Wrong. Gears 4 is a pure DX12 title that is excellently optimized for both IHVs.
 

Carfax83

Diamond Member
Nov 1, 2010
Bacon1 said:
Since when is the $150-$300 segment hardware enthusiasts? Heck people don't even OC their 980 Ti / Titans / 1070s. If you have any sources that say most people buying these cards OC them, please feel free to post it.

So you have to spend a certain amount of money to be considered a hardware enthusiast? That makes no sense. To me, a hardware enthusiast is someone who likes to tune and optimize their computer hardware, mostly for performance, but it could also be for power usage or noise suppression as well.

XabanakFanatik to me is a hardware enthusiast, because he is attempting to squeeze additional performance out of his GPU and is obviously experienced at doing so. To you, he isn't because he has a midrange GPU o_O
 

Krteq

Golden Member
May 22, 2015
Carfax83 said:
Gears 4 is a pure DX12 title that is excellently optimized for both IHVs.

Once again... Yes, GoW4 is a "DX12 title", but it's still using DX11 resource management (UE4); it's not a "pure DX12 title" at all.
 

SlickR12345

Senior member
Jan 9, 2010
I don't know how he managed those clocks on the 1060. I got the same MSI version, and my memory maxes out at 8.8 GHz; I can push the core up to 2000 MHz, but even that is pushing the thermals and power a ton. Beyond that I can get through a quick 30-second benchmark, but it's not stable for over 2 minutes and eventually freezes the operating system.
 

Carfax83

Diamond Member
Nov 1, 2010
Krteq said:
Once again... Yes, GoW4 is a "DX12 title", but it's still using DX11 resource management (UE4); it's not a "pure DX12 title" at all.

And once again:

Mike Rayner: Gears of War 4 is a DirectX 12 title on both Xbox One and Windows 10. Working with Microsoft's Silicon, Graphics and Media team and Epic, we have transitioned Unreal Engine fully to DX12. DirectX 12 has allowed us to increase performance by giving us more direct control over the hardware, simplifying the driver layer, and allowing us to make fully informed and optimal decisions on how to manage graphics resources.

Source

Gears 4 is a pure DX12 title. Microsoft invests heavily in Epic and the Unreal Engine platform, so it makes sense that there would be a great deal of collaboration between Epic and Microsoft with Gears 4. I have Gears of War 4, and it's fairly obvious the game isn't using DX11 resource management due to the fact that it uses asynchronous compute very well, exploits parallel rendering, and makes efficient use of the VRAM.
 

Krteq

Golden Member
May 22, 2015
Nope, read the UE4 docs ;)

There is an official UE4 Roadmap. As you can see there, DX12 for PC and XBONE is still in an "ongoing" state. Anyway, DX12 resource management is still not implemented.
 

Carfax83

Diamond Member
Nov 1, 2010
Quoting an earlier post:
:rolleyes:

Please provide your proof that most people who buy 1060/480 GPUs OC them heavily.

This is a goalpost shift. I never originally said that people who buy 1060/480 GPUs overclock them heavily. I said that people who buy premium versions of these GPUs are more likely to engage in overclocking. For example, the Zotac Amp Extreme series and the MSI Lightning are designed for, and marketed towards, overclockers. Same thing with the ROG series from Asus.
 

Carfax83

Diamond Member
Nov 1, 2010
SlickR12345 said:
I don't know how he managed those clocks on the 1060. I got the same MSI version, and my memory maxes out at 8.8 GHz; I can push the core up to 2000 MHz, but even that is pushing the thermals and power a ton. Beyond that I can get through a quick 30-second benchmark, but it's not stable for over 2 minutes and eventually freezes the operating system.

You must have gotten a dud then, because most GTX 1060s can hit around 2.1 GHz+, as seen in this Reddit thread.
 

Carfax83

Diamond Member
Nov 1, 2010
Krteq said:
Nope, read the UE4 docs ;)

There is an official UE4 Roadmap. As you can see there, DX12 for PC and XBONE is still in an "ongoing" state. Anyway, DX12 resource management is still not implemented.

An "official" UE4 roadmap, on t-rello.com? o_O

Anyway, I looked at the (Ongoing) "DX12 support for PC and Xbox One" card, and the last update was at the beginning of the year. It's possible that the DX12 improvements that Epic and Microsoft implemented in Gears 4 haven't made it into the base engine yet, but from what I've read, Gears 4 is a proper DX12 game that puts most of the hybrid DX11/DX12 games to shame.
 

USER8000

Golden Member
Jun 23, 2012
I like how the OP managed to turn the review into a negative for the RX480.

The HardOCP summary was quite positive:

ASUS has done a tremendously good job at engineering a robust video card that can take the AMD Radeon RX 480 GPU to new heights. The ASUS ROG STRIX RX 480 O8G GAMING is capable of a high voltage setting, and it works, with the combined excellent cooling of DirectCU III and the Power Limit slider we are able to experience a high consistent clock speed overclocked. The DirectCU III cooling solution works very well, keeping the GPU at cool temperatures with no noise. Even overclocked, where we manually increased the fan speed the noise level was more than tolerable and the temps were excellent. You could probably lower the fan speed from what we had set with the overclock achieved.



As we mentioned, this is the highest overclock we’ve achieved yet with any AMD Radeon RX 480 GPU. The overclock has improved performance over the card’s default out-of-box clock speeds as well as a reference AMD Radeon RX 480 by a great deal. The overclock provides a real, tangible benefit to the gameplay experience. With the overclock the ASUS ROG STRIX RX 480 O8G GAMING video card is very good competition to highly overclockable GeForce GTX 1060 video cards as we have shown.



There is an MSRP of $299 on the ASUS ROG STRIX RX 480 O8G GAMING video card, however, you do get a lot of potential and customized hardware with that price. Thankfully however there are price savings and rebates currently making this video card much less expensive. It can be had at $259.99 after $20 MIR at both Amazon and Newegg.



At $260 this video card competes well with the likes of highly overclockable GeForce GTX 1060 video cards like we have compared to in this evaluation. At $260 you know you’ll be getting a video card that has what it takes to push the AMD Radeon RX 480 GPU to its limits in performance.



If you are looking for one of the best AMD Radeon RX 480 GPU based video cards out there this holiday season, the ASUS ROG STRIX RX 480 O8G GAMING should be on your short list.

It got a Gold award.
 

kawi6rr

Senior member
Oct 17, 2013
USER8000 said:
I like how the OP managed to turn the review into a negative for the RX480.

The HardOCP summary was quite positive: [HardOCP conclusion quoted above]

It got a Gold award.
When I saw who the OP was I gave it a big o_O and a grain of salt. Figured this would turn out to be a green goggle troll thread.


Moderator note (Markfw, AnandTech Moderator): Insulting other members is not allowed.
 
Thread status: Not open for further replies.