
RX 480 8GB or 1060 6GB?


Sapphire 480 8GB or Gigabyte Mini 1060 6GB?

  • Sapphire's 480

    Votes: 42 (72.4%)
  • Gigabyte's 1060

    Votes: 16 (27.6%)

  • Total voters: 58
It's humorous to see how you hype the RX 480 when it wins, but downplay the GTX 1060 when it does 🙄😀



Um, no. Joker actually had an AIB RX 480 vs a REFERENCE GTX 1060. He downclocked the AIB RX 480, or so he says. We have no way of verifying whether he did or not, as he didn't display any clock speeds. Either way, this was a terrible review and should be discounted for being unreliable.


The GTX 1060 beats the RX 480 in DX11 by 2% at 1080p, and loses by 6% in DX12 and Vulkan at 1080p. However, the selection of DX12 titles is biased towards AMD. That bias needs to be taken into account, because the majority of DX12 titles aren't properly optimized for NVidia.



OK, I watched this very painful video with the craptastic techno music, and this was my take:

NVidia wins:

1) Rise of the Tomb Raider
2) GTA V
3) Gears 4
4) AC Syndicate
5) Dying Light
6) Mirror's Edge

AMD wins:

1) BF4
2) Hitman
3) Doom
4) Deus Ex MD

Draw:

1) The Witcher 3 Blood and Wine

Too close to call:

1) Mafia 3

2) Forza H3

If anyone wants to disagree with that, go ahead.



NVidia isn't getting pummeled in almost all DX12 games, so I don't know where you got that remark from. As for RE7, like most games released today, it's inherently biased towards GCN due to the consoles, so NVidia has to overcome that limitation through the drivers. And as I've said before, it usually takes about 3 months before we see the final performance on any game because there will be patches and updates that will impact performance significantly.



I don't think anyone really cares about reference-clocked benchmarks, but whatever...
So it means that I have to wait 3 (!) months till I get normal or comparable performance for an NVidia card in new titles like RE7? Seems to be a huge advantage for NV cards...
 
The GTX 1060 beats the RX 480 in DX11 by 2% at 1080p, and loses by 6% in DX12 and Vulkan at 1080p. However, the selection of DX12 titles is biased towards AMD. That bias needs to be taken into account, because the majority of DX12 titles aren't properly optimized for NVidia.

http://www.hardwarecanucks.com/foru...945-gtx-1060-vs-rx-480-updated-review-23.html

And the DX-11 game selection is biased toward NVIDIA, and still the GTX 1060 only managed +2%.

Fallout 4, The Division, Overwatch, The Witcher and GTA V are all NVIDIA-biased titles.

Not to mention that DX-12 Rise of the Tomb Raider and Gears of War are also biased towards NVIDIA, and still the RX 480 managed a +6% win in DX-12.
 
So it means that I have to wait 3 (!) months till I get normal or comparable performance for an NVidia card in new titles like RE7? Seems to be a huge advantage for NV cards...

That cuts both ways, so it's not just NVidia. It's a fact that patches and driver updates can have a huge effect on a game's performance on given hardware over time. I figured this would be self-explanatory for most hardware enthusiasts.
 
http://www.hardwarecanucks.com/foru...945-gtx-1060-vs-rx-480-updated-review-23.html

And the DX-11 game selection is biased toward NVIDIA, and still the GTX 1060 only managed +2%.

Fallout 4, The Division, Overwatch, The Witcher and GTA V are all NVIDIA-biased titles.

No. DX11 titles cannot really be biased like DX12 titles can, because the DX11 API and the driver constitute a much thicker layer. That means the developer has limited ability to target hardware-specific optimizations, so most of that burden falls on the IHV. This was one of the main complaints devs had over the years: that they had limited access to the hardware.

With DX12, this changed, as now developers had much greater access to the hardware, and driver optimizations matter a lot less than they did before.
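
The thick-vs-thin-layer point can be made concrete with a toy CPU-cost model (a minimal sketch with made-up numbers, not real API code): under DX11 the driver validates and translates every draw call, while under DX12 the application pre-records command lists, so the per-call cost falls and most of what remains is a fixed submission term.

```python
# Toy model of CPU-side submission cost per frame (all numbers are illustrative).
def frame_cpu_cost_us(draw_calls, per_call_overhead_us, fixed_submit_us):
    """CPU time spent submitting one frame, in microseconds."""
    return draw_calls * per_call_overhead_us + fixed_submit_us

draws = 5000
# DX11: the driver validates/translates every call -> high per-call cost.
dx11 = frame_cpu_cost_us(draws, per_call_overhead_us=10.0, fixed_submit_us=100.0)
# DX12: pre-recorded command lists -> cheap calls, larger fixed submit cost.
dx12 = frame_cpu_cost_us(draws, per_call_overhead_us=1.0, fixed_submit_us=500.0)
print(f"DX11: {dx11 / 1000:.1f} ms, DX12: {dx12 / 1000:.1f} ms")
```

With these made-up overheads the DX11 frame costs about 50 ms of CPU submission time versus about 5.5 ms under DX12, which is why the API change matters most in draw-call-heavy scenes, and why the burden of that per-call cost shifts from the IHV's driver to the developer's own code.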

Not to mention that DX-12 Rise of the Tomb Raider and Gears of War are also biased towards NVIDIA, and still the RX 480 managed a +6% win in DX-12.

And this is compared to Battlefield 1, Deus Ex MD, Quantum Break, Total War WH, Hitman and Doom which are biased towards AMD. Also, let me be clear that when I say "bias" I don't necessarily mean there are shenanigans involved. Usually it's simply due to the console factor and nothing else, and also probably because it's harder to tune for NVidia hardware to the point where GPU performance exceeds the DX11 pathway, as NVidia has much more efficient DX11 drivers than AMD.

For a PC only title like Total War WH though, it can probably be traced to developer incompetence or inexperience.
 
Look, I'm as much of an AMD fan as possible, I want them to do really well, but the 1060 generally does better in DX11; that is just a fact. This includes minimums, maximums, averages and frame times. Under DX12 and Vulkan AMD does have an advantage, but it's limited. Only in a few games does DX12 perform better outside of internal benchmarks.

Either way, DX12 also seems to have much higher frame variance and frame times. So overall, if you are about to spend money, you tend to go with the known right now rather than the unknown in the future. Sure, the RX 480 may win quite a bit in DX12 in a year, but in some of the pure DX12 titles like AOTS, GOW and Forza 4 it's neck and neck, so it's really taking a big leap of faith while sacrificing DX11 performance now and probably in the intermediate future.
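
On the frame-variance point: reviewers usually summarize a frame-time capture with percentile figures rather than a plain average, because an average hides stutter. A minimal sketch with made-up capture data (nearest-rank percentile is one common convention, not any specific reviewer's method):

```python
# Compute average and 99th-percentile frame times from a capture (made-up data).
def percentile(samples, p):
    """Nearest-rank percentile of a list of frame times (ms)."""
    ordered = sorted(samples)
    k = max(0, min(len(ordered) - 1, round(p / 100 * len(ordered)) - 1))
    return ordered[k]

frame_times_ms = [16.7] * 95 + [33.4] * 5  # mostly 60 fps, a few 30 fps spikes

avg = sum(frame_times_ms) / len(frame_times_ms)
p99 = percentile(frame_times_ms, 99)
print(f"avg {avg:.1f} ms, 99th percentile {p99:.1f} ms")
```

Here the average barely moves (about 17.5 ms) while the 99th percentile jumps to 33.4 ms, which is exactly the kind of gap meant by frame variance.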
 
These cards are so close I wouldn't base the choice on pure price/performance.

If you have or want a FreeSync screen I'd get the 480; if you have multiple normal monitors I'd get a 1060, since 480s draw a lot of power at idle with multiple screens.
 
No. DX11 titles cannot really be biased like DX12 titles can,

I'm sorry, but that is completely false. Game engine design can be biased no matter what API is used. If it's not, then explain how there are DX11 titles that AMD does amazingly in, DX11 titles where NVidia does much better, and most where they are even? Look at Project Cars and Anno 2205: terrible performance for AMD, yet BF1, The Division and others run perfectly fine while also looking much better and having much more going on.

Drivers aren't magical. It takes the developers optimizing their games to get the best perf. Drivers can't fix engine problems.
 
I'm sorry but that is completely false. Game engine design can be biased no matter what API is used.

I never said that game engine design couldn't be biased, so I don't even know why you're bringing this up. 😵 Of course game engines can be biased, either intentionally or unintentionally. But I was specifically talking about APIs.

If it's not, then explain how there are DX11 titles that AMD does amazingly in, DX11 titles where NVidia does much better, and most where they are even? Look at Project Cars and Anno 2205: terrible performance for AMD, yet BF1, The Division and others run perfectly fine while also looking much better and having much more going on.

Some game engines favor one architecture more than the other. Never denied this at all. I'm talking about APIs! Stop trying to move goalposts.

Drivers aren't magical. It takes the developers optimizing their games to get the best perf. Drivers can't fix engine problems.

I agree, which is why I mentioned patches as well. Actually, that's why I think you really need about three months or more to determine which IHV has superior performance in a game, because patches can have a massive impact on the dynamic.
 
I thought it was Nvidia that had the multi-monitor power bug:

http://www.digitaltrends.com/computing/nvidia-drivers-monitors/
[Image: power_multimon.png – multi-monitor power consumption chart]
 
I never said that game engine design couldn't be biased, so I don't even know why you're bringing this up. 😵 Of course game engines can be biased, either intentionally or unintentionally. But I was specifically talking about APIs.



Some game engines favor one architecture more than the other. Never denied this at all. I'm talking about APIs! Stop trying to move goalposts.



I agree, which is why I mentioned patches as well. Actually, that's why I think you really need about three months or more to determine which IHV has superior performance in a game, because patches can have a massive impact on the dynamic.

You said that DX11 titles cannot be biased like DX12 titles. Yes, yes they can. I'm not the one moving goalposts.
 
Well, that graph you posted was from this past summer, about the same time frame as the link I provided about high Nvidia multi-monitor power consumption. Hopefully driver updates will have fixed both issues by now.
 
It is simply not true that the 1060 is generally faster in DX11. There are more than enough benchmarks that show them being equal.
 
From the testing I've seen for monitor power:

1: Both AMD and Nvidia are similar
2: AMD ramps up memory speed so uses more
3: AMD and Nvidia ramp up speed so both use more

Fury is unique here in that it doesn't need to bump up speed so it uses the least amount of all for triple monitors.

https://www.computerbase.de/2016-07...nahme-des-gesamtsystems-multi-monitor-betrieb

There are multiple monitor charts and you can show extra entries as well.
 
Well, that graph you posted was from this past summer, about the same time frame as the link I provided about high Nvidia multi-monitor power consumption. Hopefully driver updates will have fixed both issues by now.
It was the most recent 470/480 review on techpowerup.

Nvidia fixed the bug. On AMD it's by design, it seems.
 
You said that DX11 titles cannot be biased like DX12 titles. Yes, yes they can. I'm not the one moving goalposts.

You ignore the context of the conversation as usual, and focus solely on the words... 🙄

It's clear that I was talking about the APIs being used in games, and not game engines. At any rate, AtenRa used GTA V as an NVidia-biased title, and I can scarcely imagine a title where the developers went as far out of their way as Rockstar did to be as unbiased as possible during development.
 
You ignore the context of the conversation as usual, and focus solely on the words... 🙄

OK, so you were commenting on him talking about bias in those DX11 titles, and then said that by using DX11 they can't be biased... But that's still talking about DX11 game engines.

Also, let me be clear that when I say "bias" I don't necessarily mean there are shenanigans involved.

Of course game engines can be biased, either intentionally or unintentionally.

AtenRa used GTA V as an NVidia-biased title, and I can scarcely imagine a title where the developers went as far out of their way as Rockstar did to be as unbiased as possible during development.

Confused as to why you have multiple definitions of bias. It's clear that the GTA engine favors Nvidia hardware and other engines favor AMD. How could the developers go out of their way not to be biased if it's impossible for them to be biased because DX11 API titles can't be biased?
 
You ignore the context of the conversation as usual, and focus solely on the words... 🙄

It's clear that I was talking about the APIs being used in games, and not game engines. At any rate, AtenRa used GTA V as an NVidia-biased title, and I can scarcely imagine a title where the developers went as far out of their way as Rockstar did to be as unbiased as possible during development.

Please post proof or evidence that supports the assertion that the API itself is biased. Lower performance on nVidia cards generally in DX12 is not proof of bias, standing alone. Lower performance on nVidia cards could be due to 1) nVidia hardware being worse at dx12, 2) nVidia drivers being worse at dx12, 3) DX12 itself being poorly suited to nVidia hardware (i.e. "bias"), or some combination of those three. I have never seen a shred of proof that establishes 3) without also being explainable through 1) and 2).

The only proof I've yet seen is Nvidia's fast-interrupt support on Pascal, which proves some part of Maxwell's trouble with DX12 is due to hardware.
 
Please post proof or evidence that supports the assertion that the API itself is biased.

Where did I say any API was biased? 😵

Lower performance on nVidia cards generally in DX12 is not proof of bias, standing alone

Indeed. But when the DX11 path is much faster than the DX12 path, what then? This clearly indicates a lack of proper optimization, which could be due to developer incompetence or lack of experience.

The only proof I've yet seen is Nvidia's fast-interrupt support on Pascal, which proves some part of Maxwell's trouble with DX12 is due to hardware.

That's asynchronous compute, which isn't even a specification for DX12. 🙄
 
That's asynchronous compute, which isn't even a specification for DX12. 🙄

I love how you guys keep saying that, yet it's a basic feature of DX12, so no separate spec is "required". If you have basic DX12 support, you are required to have async compute support.

https://msdn.microsoft.com/en-us/li...spx#asynchronous_compute_and_graphics_example

Indeed. But when the DX11 path is much faster than the DX12 path, what then? This clearly indicates a lack of proper optimization, which could be due to developer incompetence or lack of experience.

Or drivers.

Look at Vulkan: a huge boost in Doom from an NV driver update. There have been big boosts in DX12 games from drivers as well, because the DX12/Vulkan drivers are still a work in progress for both AMD and Nvidia.

Like he said:

2) nVidia drivers being worse at dx12
 
I love how you guys keep saying that, yet it's a basic feature of DX12, so no separate spec is "required". If you have basic DX12 support, you are required to have async compute support.

https://msdn.microsoft.com/en-us/li...spx#asynchronous_compute_and_graphics_example

Nowhere in that article does it say that asynchronous compute is necessary for "basic" DX12 support. DX12, and ANY OTHER low-level API, only provide access to this capability, something which GPUs have had for years to varying degrees.

Or drivers.

Look at Vulkan: a huge boost in Doom from an NV driver update. There have been big boosts in DX12 games from drivers as well, because the DX12/Vulkan drivers are still a work in progress for both AMD and Nvidia.

If it were due to drivers, then the problem would be more widespread. Instead, you see a wide variance in the DX12 performance of NVidia with some titles performing very well, and others being mediocre.
 

All those graphs are done with 60 Hz monitors. I've run a 144 Hz monitor on my 1070, and it had to increase the core clock to 1 GHz at idle on the 2D desktop. It was consuming far more than 7 W...

I think many people have high refresh rate monitors these days, so these 60Hz graphs are extremely misleading to put it mildly.
 
Nowhere in that article does it say that asynchronous compute is necessary for "basic" DX12 support. DX12, and ANY OTHER low-level API, only provide access to this capability, something which GPUs have had for years to varying degrees.

Exactly, it is a new queue type that is standard in DX12. However, if you don't run the tasks asynchronously you end up heavily bottlenecked, which is why Oxide called it a disaster and had to disable it for Maxwell... you know, the cards that were supposed to have async compute enabled and were marketed that way by Nvidia, that were then going to get a driver update, and were then ignored in favor of Pascal? Yeah, I still remember that.
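
The scheduling argument behind this reduces to a simple picture (a toy model with illustrative numbers, not measured vendor behavior): hardware that can execute the graphics and compute queues concurrently hides the compute work behind the graphics work, while hardware or drivers that serialize the queues pay for both in full.

```python
# Toy frame-time model for async compute (illustrative millisecond costs).
def frame_time_ms(graphics_ms, compute_ms, concurrent):
    """Concurrent queues overlap the work; serialized queues add it up."""
    if concurrent:
        return max(graphics_ms, compute_ms)
    return graphics_ms + compute_ms

gfx_ms, comp_ms = 14.0, 4.0
overlapped = frame_time_ms(gfx_ms, comp_ms, concurrent=True)   # compute hidden
serialized = frame_time_ms(gfx_ms, comp_ms, concurrent=False)  # compute added
print(f"overlapped: {overlapped} ms, serialized: {serialized} ms")
```

If the compute work is small relative to the graphics work, overlapping it is nearly free; if the queues serialize, the same workload costs the sum, which is the kind of penalty that led to the path being disabled on Maxwell.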

If it were due to drivers, then the problem would be more widespread. Instead, you see a wide variance in the DX12 performance of NVidia with some titles performing very well, and others being mediocre.

Yet we've seen driver updates for all games so far, both DX12 and Vulkan. And it's the same thing: it's down to engine design. We see AMD cards faster in some engines, like the latest Titanfall 2 and RE7, which are DX11. Most DX12 engines are just ports from DX11; it will be a while before we see the big gains available.
 