GeForce GTX 1060 Thread: faster than RX 480, 120W, $249


Sweepr

Diamond Member
May 12, 2006
5,148
1,143
131
Which graphics card should you buy for 1080p60 gameplay?

We'd be quite happy with either the Radeon RX 480 8GB or GeForce GTX 1060 6GB for our build, but titles such as No Man's Sky do suggest that AMD's single-threaded OpenGL and DX11 drivers can still hold back the generally excellent hardware in certain titles. Combine Nvidia's driver advantage here with its small speed bump and right now, we'd be opting for the higher-specced GTX 1060 as the GPU of choice.

http://www.eurogamer.net/articles/digitalfoundry-2016-lets-build-a-1080p60-gaming-pc



Nvidia GeForce GTX 1060 3GB vs 6GB review



In almost all cases, the budget Nvidia offering outperforms the RX 470 (and remember, this will be boosted by a couple of frames owing to its factory OC) along with both iterations of the RX 480. Even Ashes of the Singularity - a weak point for Nvidia - sees it offer basic parity across the run of the clip. In all of the benches here, there is no evidence to suggest that the 3GB framebuffer causes issues - except in one title, Hitman. Here, the performance drop-off on GTX 1060 3GB is significant - and we strongly suspect that DX12, where the developer takes over memory management duties, sees the card hit its VRAM limit.

Re-benching a few titles, the impact of the overclock becomes clear - we're looking at a 12 per cent increase in performance over the GTX 1060's stock configuration, dropping to eight per cent when compared to the factory OC we have with the MSI Gaming X's set-up. This isn't exactly a revelatory increase overall - the days of 20 per cent overclocks with the last-gen 900-series Maxwell cards are clearly over. However, it is enough to push the cut-down GP106 clear of the fully enabled version when we're not limited by VRAM.

Going back to our GTX 1080 review, we were pleasantly surprised to see how well the old GTX 780 Ti held up on our modern benchmarking suite bearing in mind its 3GB of VRAM. The new GTX 1060 3GB has the same amount of memory but an additional two generations' worth of memory compression optimisations - the end result is that three gigs is indeed enough for top-tier 1080p60 gameplay - as long as you stay away from memory hogs like MSAA (which tends to kill frame-rate) along with 'HQ/HD' texture packs and extreme resolution texture options. By and large, the visual impact of these options at 1080p is rather limited anyway - generally speaking, they're designed for 4K screens.

http://www.eurogamer.net/articles/digitalfoundry-2016-nvidia-geforce-gtx-1060-3gb-vs-6gb-review
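
For anyone wondering what "the developer takes over memory management duties" actually means in DX12: under DX11 the driver decides where resources live and pages them in and out for you, while in DX12 the application allocates explicitly and eats the consequences when VRAM runs out. A minimal, hypothetical sketch of that (assumes an existing ID3D12Device; error handling omitted):

Code:
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Allocate a texture directly in GPU-local memory. In DX12 the app picks
// the heap type itself; nothing pages the resource out if VRAM runs short,
// so a 3GB card can simply hit a wall that the DX11 driver used to hide.
ComPtr<ID3D12Resource> CreateDefaultTexture(ID3D12Device* device,
                                            UINT width, UINT height)
{
    D3D12_HEAP_PROPERTIES heapProps = {};
    heapProps.Type = D3D12_HEAP_TYPE_DEFAULT;   // VRAM on a discrete GPU

    D3D12_RESOURCE_DESC desc = {};
    desc.Dimension = D3D12_RESOURCE_DIMENSION_TEXTURE2D;
    desc.Width = width;
    desc.Height = height;
    desc.DepthOrArraySize = 1;
    desc.MipLevels = 1;
    desc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 1;

    ComPtr<ID3D12Resource> tex;
    device->CreateCommittedResource(&heapProps, D3D12_HEAP_FLAG_NONE,
                                    &desc, D3D12_RESOURCE_STATE_COMMON,
                                    nullptr, IID_PPV_ARGS(&tex));
    return tex;
}

If a game's own memory budgeting is off, a 3GB card pays for it in DX12, which fits what DF saw in Hitman.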
 
  • Haha
Reactions: Grazick

Shivansps

Diamond Member
Sep 11, 2013
3,875
1,530
136
So, another DX11 game with a forced and broken DX12 implementation that the devs never wanted to do... what news.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Nope, they developed the game for DX11 and they are now looking at how to cram DX12 into it, because AMD paid them to do so. Time that could be spent doing other things.

Where is your proof of any of this?

DICE has wanted a low-level API for a long time; that's why they developed Mantle with AMD.

The head engineer for Frostbite said long ago that he wanted DX12 only, but knew that couldn't happen.

They'd rather drop DX11 support and focus solely on low level APIs like Vulkan + DX12 if possible.

Would like to require Win10 & DX12/WDDM2.0 as a minspec for our holiday 2016 games on Frostbite, likely a bit aggressive but major benefits

https://twitter.com/repi/status/585556554667163648

It's clear the DX12 path isn't fully optimized yet; it's a BETA and the game releases in 2 months.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
So, another DX11 game with a forced and broken DX12 implementation that the devs never wanted to do... what news.

Erm, this is a DICE game running on the Frostbite engine. The technical director for the Frostbite engine is Johan Andersson, who previously said this:

Would like to require Win10 & DX12/WDDM2.0 as a minspec for our holiday 2016 games on Frostbite, likely a bit aggressive but major benefits

So the developers absolutely wanted to do this. In fact, no one has been more positive about Mantle/DX12/Vulkan than DICE/Johan Andersson (he had a big hand in the development of said APIs).

Edit: Ninja'ed by Bacon1 :)
 
  • Like
Reactions: Headfoot and Bacon1

zinfamous

No Lifer
Jul 12, 2006
111,238
30,211
146
So, another DX11 game with a forced and broken DX12 implementation that the devs never wanted to do... what news.

So a game that isn't even out for another month now has some broken and forced implementation. Well, isn't that a new precedent!

You should probably go ahead and demand your refund on your pre-order.
 

Shivansps

Diamond Member
Sep 11, 2013
3,875
1,530
136
No, the game currently runs like crap on DX12 and fine on DX11. That means it was developed for DX11, they didn't care about DX12 until the last moment, and they are now trying to port their DX11 work into DX12, period.

I don't care about made-up excuses like those.

BTW, the Unity developers also said a lot of nice things about DX12, and their DX12 work on that engine is just pure crap.
 

zinfamous

No Lifer
Jul 12, 2006
111,238
30,211
146
No, the game currently runs like crap on DX12 and fine on DX11. That means it was developed for DX11, they didn't care about DX12 until the last moment, and they are now trying to port their DX11 work into DX12, period.

I don't care about made-up excuses like those.

BTW, the Unity developers also said a lot of nice things about DX12, and their DX12 work on that engine is just pure crap.

I think you need to reread the more recent posts about Dice, their developers, and their overall goals with the Frostbite engine, and accept that your perspective on this matter is both baseless and fantastical. If you can't accept that, then you shouldn't bother making this asinine point about something that you clearly do not understand.

But again... why does it matter to you, in this thread about the 1060 vs 480? What does DX12 have to do with anything here? Shouldn't you want games to move forward with updated options rather than sitting in the past?
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
Wow, it seems that with the newer games tested, the difference between the RX480 and GTX1060 has shrunk.

This is like what started to happen with my GTX960. Sigh.

Plus, a Fury X is within a few per cent of a GTX980TI at 1080p!! It used to get thrashed at 1080p by a GTX980TI.
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136

You forgot to fully quote the entire summary:

Nvidia GeForce GTX 1060 3GB - the Digital Foundry verdict

Going back to our GTX 1080 review, we were pleasantly surprised to see how well the old GTX 780 Ti held up on our modern benchmarking suite bearing in mind its 3GB of VRAM. The new GTX 1060 3GB has the same amount of memory but an additional two generations' worth of memory compression optimisations - the end result is that three gigs is indeed enough for top-tier 1080p60 gameplay - as long as you stay away from memory hogs like MSAA (which tends to kill frame-rate) along with 'HQ/HD' texture packs and extreme resolution texture options. By and large, the visual impact of these options at 1080p is rather limited anyway - generally speaking, they're designed for 4K screens.
That said, as good as Nvidia's compression technology is, it is lossless in nature, meaning that its effectiveness won't just change on a title by title basis, but at a per-scene level too, according to the content. And with the Hitman benchmark suggesting that even at 1080p, we might be hitting the three gig limit and seeing an additional hit to performance not caused by the reduced CUDA core count, we do have to wonder about the level of future-proofing this cut-down GTX 1060 has. The visual improvement found in super high resolution texture packs may be limited, but we certainly wouldn't want to drop down to medium quality artwork on future titles in order to sustain the expected level of performance.

In the here and now, the three gig GTX 1060 is a good card with excellent performance at its £189/$199 price-point, but its VRAM allocation may well hit its limits more quickly than the four gigs found in the RX 470/480. None of the new wave of sub-£200/$200 graphics cards should be entirely ruled out, and this pared back GTX 1060 still packs plenty of punch - but investing just a little extra in the GTX 1060 6GB would be our recommendation. With certain six gig versions retailing under the initial suggested price-point, grabbing the more capable model needn't break the bank.
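
To see why a lossless scheme's effectiveness varies with scene content (the point DF makes above), here's a toy delta-compression sketch - explicitly not NVIDIA's actual algorithm, just the general idea of anchor-plus-deltas with a raw fallback:

Code:
#include <cstdint>
#include <vector>

// Try to delta-compress one 8-pixel block of RGBA8 pixels against an anchor.
// Returns false when any per-channel delta won't fit in 4 signed bits,
// in which case the caller keeps the raw 32-byte block (lossless guarantee).
bool compressBlock(const uint32_t px[8], std::vector<uint8_t>& out)
{
    uint8_t nibbles[28];                              // 7 pixels x 4 channels
    for (int i = 1; i < 8; ++i) {
        for (int c = 0; c < 4; ++c) {
            int d = int((px[i] >> (8 * c)) & 0xFF)
                  - int((px[0] >> (8 * c)) & 0xFF);
            if (d < -8 || d > 7)
                return false;                         // noisy block: keep raw
            nibbles[(i - 1) * 4 + c] = uint8_t(d & 0xF);
        }
    }
    out.push_back(0x01);                              // tag: delta-compressed
    for (int b = 0; b < 4; ++b)                       // anchor pixel verbatim
        out.push_back(uint8_t(px[0] >> (8 * b)));
    for (int n = 0; n < 28; n += 2)                   // two 4-bit deltas/byte
        out.push_back(uint8_t(nibbles[n] | (nibbles[n + 1] << 4)));
    return true;                                      // 19 bytes instead of 32
}

Smooth gradients compress well; noisy textures trip the fallback and get stored raw, so the bandwidth saving depends on what's on screen at the time.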

They also noticed things like the following with the latest Tomb Raider:

http://images.eurogamer.net/2015/ar....png/EG11/resize/600x-1/quality/80/format/jpg

 

Sweepr

Diamond Member
May 12, 2006
5,148
1,143
131
Wow, it seems that with the newer games tested, the difference between the RX480 and GTX1060 has shrunk.

This is like what started to happen with my GTX960. Sigh.

Only because they included Doom Vulkan and tested NVIDIA cards with an older driver (372.54 WHQL instead of 372.70 WHQL). We already know what happens with the latest:

http://cdn.sweclockers.com/artikel/diagram/12064?key=182fcf4d959927d230f7f9552da13d38

A driver update was able to negate the advantage of GCN shader optimizations + exclusive Async Compute. I guess Pascal is not so bad on low-level APIs after all. :p
 
Last edited:

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
Only because they included Doom Vulkan and tested NVIDIA cards with an older driver (372.54 WHQL instead of 372.70 WHQL). We already know what happens with the latest:

http://cdn.sweclockers.com/artikel/diagram/12064?key=182fcf4d959927d230f7f9552da13d38

Exciting to think that a driver update negates the advantage of GCN shader optimizations + exclusive Async Compute for one vendor. I guess Pascal is not so bad after all. :)

Nope, since there were improvements all round too - but I suppose people like you kept saying the same for Maxwell, and now my GTX960 is increasingly being beaten by an R9 380. I've yet to see all these updates for async and Vulkan. Still waiting after the GTX1060 launch. LMAO.

[SweClockers charts 12062 and 12063: GTX1060 Doom results, OpenGL vs Vulkan]



You come across as a tad optimistic, mate - look at the OpenGL and Vulkan scores on the GTX1060. They are basically the same.
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,143
131
Nope, since there were improvements all round too - but I suppose people like you kept saying the same, and now my GTX960 is increasingly being beaten by an R9 380. I've yet to see all these updates for async and Vulkan. Still waiting after the GTX1060 launch. LMAO.

Nope, it was mostly because of Doom Vulkan, yet the difference is down only a bit (using outdated pre-372.70 results for NVIDIA).

You come across as a tad optimistic, mate - look at the OpenGL and Vulkan scores on the GTX1060. They are basically the same.

Actually, now I see that test was on the older 372.54 as well. So the GTX 1060 6GB was beating its competitor at 1440p and 4K without the improvements from 372.70? Thanks for pointing it out.
 
Last edited:

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
Only because they included Doom Vulkan and tested NVIDIA cards with an older driver (372.54 WHQL instead of 372.70 WHQL). We already know what happens with the latest:

http://cdn.sweclockers.com/artikel/diagram/12064?key=182fcf4d959927d230f7f9552da13d38

A driver update was able to negate the advantage of GCN shader optimizations + exclusive Async Compute. I guess Pascal is not so bad on low-level APIs after all. :p

Nope, it was mostly because of Doom Vulkan, yet the difference is down only a bit (using outdated pre-372.70 results for NVIDIA).



Actually, now I see that test was on the older 372.54 as well. So the GTX 1060 6GB was beating its competitor at 1440p and 4K without the improvements from 372.70? Thanks for pointing it out.

Nope, because the Tomb Raider scores in the TechPowerUp review look better for the RX480:

https://www.techpowerup.com/forums/threads/msi-gtx-1060-gaming-x-3gb.225481/page-2#post-3517976

With the newer games tested, the difference has decreased, and using your own argument about games which prefer AMD, some games like Anno 2205 and WoW run far better on Nvidia (I should know, having friends who play those games).

Plus, the scores for the GTX1060 are virtually the same for both OpenGL and Vulkan - now you are shifting your argument for some reason. You said it was the latest; now you are shifting to "but it's not the latest" when you didn't even look at the graph. And I don't know a single GTX1060 or RX480 owner in real life who uses those cards at 4K - 1080p is still the most common resolution! ;)

You do realise everything is not an AMD vs Nvidia war, right?? Four of my last five fastest cards have been Nvidia.

PS:

I did check out the purported test in the other threads with these magical drivers you mentioned.

The RX480 still seemed faster.

Plus I have given up on a Vulkan update for Maxwell - so I will have to use OpenGL on my current card.
 
  • Like
Reactions: Bacon1

Sweepr

Diamond Member
May 12, 2006
5,148
1,143
131
Plus, the scores for the GTX1060 are virtually the same for both OpenGL and Vulkan

Wrong, there are improvements with 372.70:

https://forums.anandtech.com/thread...kan-api-explored.2482691/page-3#post-38451300
https://forums.anandtech.com/thread...kan-api-explored.2482691/page-4#post-38453021

GTX 1060 6GB also wins at BF1 Beta:

https://forums.anandtech.com/thread...rx-480-120w-249.2478605/page-99#post-38446302

So it remains the faster card overall, and seems to have an edge in one of the most important titles of the year, if not the most important (as of right now).

now you are shifting your argument for some reason. You said it was the latest; now you are shifting to "but it's not the latest" when you didn't even look at the graph. And I don't know a single GTX1060 or RX480 owner in real life who uses those cards at 4K - 1080p is still the most common resolution! ;)

Maybe not at 4K, but 1440p was faster on the GTX 1060 6GB as well in SweClockers' test, and that's before the Vulkan improvements (372.70).
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
Wrong, there are improvements with 372.70:

https://forums.anandtech.com/thread...kan-api-explored.2482691/page-3#post-38451300
https://forums.anandtech.com/thread...kan-api-explored.2482691/page-4#post-38453021

GTX 1060 6GB also wins at BF1 Beta:

https://forums.anandtech.com/thread...rx-480-120w-249.2478605/page-99#post-38446302

So it remains the faster card overall, and seems to have an edge in one of the most important titles of the year, if not the most important (as of right now).



Maybe not at 4K, but 1440p was faster on the GTX 1060 6GB as well in SweClockers' test, and that's before the Vulkan improvements (372.70).

Emm, you did read what he said in that first link:

I was testing with my 750TI, and 372.70 definitely gets better perf, still OpenGL performance is better.

Same as my GTX960 - so why are you trying to twist things to make it sound like Maxwell is getting improvements? Why are you making excuses? I have a GTX960 and I have Doom. OpenGL is still better, or virtually the same.

I already finished the game too - it's been months since it was released.

The second link is for a GTX1080??? It's not a GTX1060.

I am not sure why you conveniently missed this later in another thread:

http://i.imgur.com/lTQd7Kr.png



Hitman928 said:
Here are my results. The latest Nvidia driver definitely fixed the Vulkan sync issue, so now Vulkan is the preferred API for both cards. I'll have to do 1440p at some point as well.

That is with the RX480 reference card against a custom GTX1060 and was run yesterday.

It's not all that inspiring, since Nvidia showed off Doom running on a GTX1080 using Vulkan months before AMD did! ;)

Plus, good to see Nvidia doing well in the BF1 beta, but we will need to wait and see what happens when the final release arrives.

I remember someone saying there was a new AMD BF1 driver released after that review - but it's still a fail that AMD is lagging in that regard.
 
Last edited:
  • Like
Reactions: Bacon1

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Wrong, there are improvements with 372.70

You keep saying that .70 fixed it, but it sounds like it was fixed in the earlier 372.54, considering the first one to point it out was the Swedish review you linked above, using the prior driver.

Swedish overclockers' test of Doom Vulkan vs OpenGL using the new NVidia 372.xx drivers, which fixed the poor performance in Vulkan

Saying it all comes down to drivers is insulting, and really shows a lack of understanding of the situation here. They were probably 'asked' to release this Vulkan update before the GTX 1060's launch, ready or not. Wouldn't surprise me if future updates (months from now) improve Pascal's performance.

Funny that you are now all gung-ho about Doom when you previously said that id rushed the Vulkan release and it was impossible for drivers alone to fix it...

Guess Nvidia's driver was just poor after all.
 
  • Like
Reactions: Grazick

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
Well, in February this year Nvidia said the following:

https://blogs.nvidia.com/blog/2016/02/16/vulkan-graphics-api/

The Vulkan Graphics API Is Here—and Your NVIDIA GPU Is Ready

I assume that meant Maxwell, and apparently no, Nvidia, my card is not ready.

Yet they showed off the GTX1080 running Doom using Vulkan in May this year. It's only taken them 4 months, and I thought Bethesda and id were generally closer to Nvidia too, looking at Fallout 4 and games like RAGE.

I was wondering when Nvidia would release the async and Vulkan updates for Maxwell - apparently that update was probably meant to be Pascal.
 
  • Like
Reactions: Bacon1

Sweepr

Diamond Member
May 12, 2006
5,148
1,143
131
You keep saying that .70 fixed it, but it sounds like it was fixed in the earlier 372.54, considering the first one to point it out was the Swedish review you linked above, using the prior driver.

This is the third time you've quoted this post from another thread. Are you trying to prove a point, or maybe just going a bit too far on your personal vendetta? I said it didn't come down to drivers exclusively, but I didn't rule out their importance. And this still doesn't explain the exclusive GCN Shader Optimizations + Async Compute and the suspicious release right before the GTX 1060's launch. ;)
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
A driver update was able to negate the advantage of GCN shader optimizations + exclusive Async Compute. I guess Pascal is not so bad on low-level APIs after all.

And this still doesn't explain the exclusive GCN Shader Optimizations + Async Compute

So which is it? Are those optimizations and async compute so special, or not? Obviously all it took was proper drivers from Nvidia to negate them, so why does their inclusion matter at all?

If Nvidia had those drivers out when the 1060 launched, there would be no issue, right?

Also, the 1080 does appear to do work using async compute in Doom:

http://imgur.com/a/hbX8t

So there goes that theory... guess the missing async compute was driver-related after all as well.
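
For reference, "async compute" at the API level just means putting compute work on a queue that can run alongside the graphics queue. A rough Vulkan sketch (hypothetical helper; assumes an already-created VkPhysicalDevice and a known graphics family index):

Code:
#include <vulkan/vulkan.h>
#include <vector>

// Find a compute-capable queue family other than the graphics family.
// Submitting compute work there is what allows it to overlap graphics work.
int findAsyncComputeFamily(VkPhysicalDevice gpu, uint32_t graphicsFamily)
{
    uint32_t count = 0;
    vkGetPhysicalDeviceQueueFamilyProperties(gpu, &count, nullptr);
    std::vector<VkQueueFamilyProperties> families(count);
    vkGetPhysicalDeviceQueueFamilyProperties(gpu, &count, families.data());

    for (uint32_t i = 0; i < count; ++i) {
        if (i != graphicsFamily &&
            (families[i].queueFlags & VK_QUEUE_COMPUTE_BIT))
            return int(i);
    }
    return -1;  // no separate family: compute still runs, just not async
}

Whether the hardware actually overlaps the two queues rather than timeslicing them is up to the GPU and driver, which is exactly what the Pascal vs Maxwell async argument has been about.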

Should we bring up all of the Nvidia-specific OpenGL extensions that companies use?
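
On that note, detecting and using vendor extensions is routine. A sketch of how a renderer checks for an NVIDIA-specific GL extension (assumes a 3.0+ context with function pointers already loaded, e.g. via GLEW; the extension named at the end is just an example):

Code:
#include <GL/glew.h>   // loader providing glGetStringi, GL_NUM_EXTENSIONS
#include <cstring>

// Enumerate the context's extension strings looking for a specific one.
bool hasExtension(const char* name)
{
    GLint n = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &n);
    for (GLint i = 0; i < n; ++i) {
        const char* ext = reinterpret_cast<const char*>(
            glGetStringi(GL_EXTENSIONS, GLuint(i)));
        if (ext && std::strcmp(ext, name) == 0)
            return true;
    }
    return false;
}

// A renderer might then take a vendor-only fast path:
// if (hasExtension("GL_NV_bindless_texture")) { /* NVIDIA-specific path */ }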