
Next-gen API benches for RX 480 & 1060.


PhonakV30

Senior member
Oct 26, 2009
977
347
136
I've seen in 1060 reviews that nVidia is ready to bite their own finger off if that means AMD loses an arm. They can sacrifice a bit of performance going from Vulkan to OpenGL, just to make AMD cards suffer a bigger performance loss.

They will kill AMD with tessellation and render previous NV gens obsolete with preemption.
They can't kill the RX 480 with tessellation.

 

fingerbob69

Member
Jun 8, 2016
38
9
36
I don't think they know how yet. They simply write the code as specified in Intel's and Khronos' programming guide and it works on AMD. That's exactly what Oxide said they did. They activated it on nVidia too but it created performance issues. So, they were forced to turn it off.
This. Combined with that tessellation bench just posted, you hear again and again that the 480 has nVidia shit-scared. Their whole 10xx launch to date just seems so rushed. Even more so when you throw in the Titan P for August.

Vega, the GTX killer, in October? Or just the whole emerging DX12/Vulkan scenario?
 

tonyfreak215

Senior member
Nov 21, 2008
274
0
76
This. Combined with that tessellation bench just posted, you hear again and again that the 480 has nVidia shit-scared. Their whole 10xx launch to date just seems so rushed. Even more so when you throw in the Titan P for August.

Vega, the GTX killer, in October? Or just the whole emerging DX12/Vulkan scenario?
I agree. It's interesting that Nvidia would release the Titan so early; seems like a cash grab to me.
 

Yakk

Golden Member
May 28, 2016
1,574
272
81
Something tells me Battlefield 1 will also be using Async + Intrinsic Shaders on DX12 like DOOM does.



Should be an interesting release.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
Something tells me Battlefield 1 will also be using Async + Intrinsic Shaders on DX12 like DOOM does.
Has dx12 actually been confirmed or is there still a small chance for vulkan?

I remember a tweet from repi saying they were still on the fence, but was a long time ago.
 

Yakk

Golden Member
May 28, 2016
1,574
272
81
Has dx12 actually been confirmed or is there still a small chance for vulkan?

I remember a tweet from repi saying they were still on the fence, but was a long time ago.
DX12 was confirmed as far as I know, though honestly it looks to be more of a political decision, since DICE pretty much created Vulkan in large part with AMD (and by extension DX12, but that's for another thread). So creating a Vulkan version should be relatively quick and easy for them.

However I'm sure Microsoft has EA onboard for using DX12.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,013
91
DX12 was confirmed as far as I know, though honestly it looks to be more of a political decision, since DICE pretty much created Vulkan in large part with AMD (and by extension DX12, but that's for another thread). So creating a Vulkan version should be relatively quick and easy for them.

However I'm sure Microsoft has EA onboard for using DX12.
I'm guessing that DICE will be phasing out DX11 support and replacing it with Vulkan.

So a Vulkan + DX12 engine. It might be a year out, though, or maybe they will bring it to BF1 through updates.
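An engine that targets both APIs usually hides them behind a thin backend abstraction and picks the implementation at startup. A minimal sketch of that idea (all class and function names here are hypothetical illustrations, not Frostbite's actual interfaces):

```python
# Minimal sketch of a renderer with swappable low-level API backends.
# All names are hypothetical illustrations, not a real engine's API.
from abc import ABC, abstractmethod

class RenderBackend(ABC):
    @abstractmethod
    def name(self) -> str: ...

    @abstractmethod
    def submit(self, draw_calls: list) -> int:
        """Submit a batch of draw calls; return how many were issued."""

class VulkanBackend(RenderBackend):
    def name(self): return "Vulkan"
    def submit(self, draw_calls): return len(draw_calls)

class DX12Backend(RenderBackend):
    def name(self): return "DX12"
    def submit(self, draw_calls): return len(draw_calls)

def pick_backend(os_name: str) -> RenderBackend:
    # Windows 10 gets DX12; everything else falls back to Vulkan.
    return DX12Backend() if os_name == "windows10" else VulkanBackend()

backend = pick_backend("linux")
print(backend.name(), backend.submit(["mesh_a", "mesh_b"]))  # prints: Vulkan 2
```

The game code only ever talks to `RenderBackend`, which is what makes dropping DX11 and adding Vulkan a contained change rather than an engine rewrite.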
 

parvadomus

Senior member
Dec 11, 2012
685
14
76
They can't kill the RX 480 with tessellation.

What really hurts AMD is how far ahead Nvidia is with their texture compression. That's the real key behind their lower power consumption.
If AMD manages to get similar compression technology, they will pretty surely be on par in power consumption.
 

Yakk

Golden Member
May 28, 2016
1,574
272
81
I'm guessing that DICE will be phasing out DX11 support and replacing it with Vulkan.

So a Vulkan + DX12 engine. It might be a year out, though, or maybe they will bring it to BF1 through updates.
If it were only up to DICE, they would relegate DX11 to legacy status THIS YEAR.

Would like to require Win10 & DX12/WDDM2.0 as a minspec for our holiday 2016 games on Frostbite, likely a bit aggressive but major benefits

— Johan Andersson (@repi) April 7, 2015

@EVGA_JacobF indeed! get rid of the Windows legacy and reap the benefits of a modern graphics API & memory management (WDDM 2.0)

— Johan Andersson (@repi) April 7, 2015
DICE's Frostbite engine has been DX12-ready for a long time now.

We know AMD sure aren't holding anybody back to use DX11 either!! ;)
 

Mikeduffy

Member
Jun 5, 2016
27
18
46
I guess it's safe to assume that the 480 will beat the 1060 in most, if not all, MS XBone ports. So the 480 will most likely be the leader in the newer APIs.

A little off topic - I haven't heard much about the Primitive Discard Accelerator at all. Does anyone know how it's performing? Does it work in DX12/DX11? Is it something that will need specific optimizations?
 
Feb 19, 2009
10,458
5
76
Going DX12-only makes zero sense. What about Win 7 & 8 users? Linux? OSX?

Go Vulkan and target them all. I don't know why these publishers don't see that; id Software has laid the foundation for them and shown how it's done. Wasting time on DX11 and DX12, then porting to OpenGL for Linux & OSX... ew.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,013
91
Going DX12-only makes zero sense. What about Win 7 & 8 users? Linux? OSX?

Go Vulkan and target them all. I don't know why these publishers don't see that; id Software has laid the foundation for them and shown how it's done. Wasting time on DX11 and DX12, then porting to OpenGL for Linux & OSX... ew.
Well, there is a lower hardware entry point, but I think they will by next year.
 

A_Skywalker

Member
Apr 9, 2016
76
4
71


Ashes of the Singularity, 1080p, crazy settings.

RX 480 with i5 2400, DX12.

DX12 improved my average fps from 26 to 34, but mainly because I have a weak CPU. I almost got to the normal score shown on the first page of this thread. I also use PCIe 2.0.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
200
106


Ashes of the Singularity, 1080p, crazy settings.

RX 480 with i5 2400, DX12.

DX12 improved my average fps from 26 to 34, but mainly because I have a weak CPU. I almost got to the normal score shown on the first page of this thread. I also use PCIe 2.0.
You shouldn't have to have the latest state-of-the-art CPU just to play games.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,013
91
You shouldn't have to have the latest state-of-the-art CPU just to play games.
The i5 2400 wasn't super powerful even back in 2011. Overclocking will help, and is what made the 2500K amazing, but a non-K chip is limited there.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
What really hurts AMD is how far ahead Nvidia is with their texture compression. That's the real key behind their lower power consumption.
If AMD manages to get similar compression technology, they will pretty surely be on par in power consumption.
No it's not. The fact that AMD has ACEs and HWS is where the power overhead comes from; texture compression saves bandwidth, not power.
 

maddie

Diamond Member
Jul 18, 2010
3,387
2,332
136
No it's not. The fact that AMD has ACEs and HWS is where the power overhead comes from; texture compression saves bandwidth, not power.
Not true. Most of the power consumed in computation goes to data movement. Moving fewer bits generally results in less power consumption.
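A quick back-of-envelope sketch of that point. The energy-per-bit figure below is an assumed, order-of-magnitude number for off-chip memory traffic, not a measured spec for any particular card; the bandwidth and compression-ratio figures are likewise illustrative:

```python
# Back-of-envelope: power spent moving texture data, before vs after
# lossless compression. PJ_PER_BIT_DRAM is an ASSUMED order-of-magnitude
# cost for off-chip (DRAM) traffic, purely for illustration.
PJ_PER_BIT_DRAM = 20.0  # assumed picojoules per bit moved off-chip

def transfer_energy_watts(bytes_per_sec, pj_per_bit=PJ_PER_BIT_DRAM):
    """Average power (W) spent just moving this many bytes per second."""
    bits_per_sec = bytes_per_sec * 8
    return bits_per_sec * pj_per_bit * 1e-12  # pJ -> J

raw = transfer_energy_watts(150e9)                # 150 GB/s of texture traffic
compressed = transfer_energy_watts(150e9 * 0.75)  # ~25% bandwidth saved

print(f"uncompressed: {raw:.1f} W, compressed: {compressed:.1f} W")
```

Under these assumptions the uncompressed stream costs about 24 W and the compressed one about 18 W, so a bandwidth saving is also a power saving, which is the point being made.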
 

brandonmatic

Member
Jul 13, 2013
199
21
81
I guess it's safe to assume that the 480 will beat the 1060 in most, if not all, MS XBone ports. So the 480 will most likely be the leader in the newer APIs.

A little off topic - I haven't heard much about the Primitive Discard Accelerator at all. Does anyone know how it's performing? Does it work in DX12/DX11? Is it something that will need specific optimizations?
Yeah, the RX 480 will almost certainly be faster than the 1060 in newer games - and maybe by a lot. But the 1060 is still more power efficient by a good margin.
 

Yakk

Golden Member
May 28, 2016
1,574
272
81
Yeah, the RX 480 will almost certainly be faster than the 1060 in newer games - and maybe by a lot. But the 1060 is still more power efficient by a good margin.
I'm sure there are also plenty of other cards along with the 1060 which are slower than the 480 and use less power.
 
Feb 19, 2009
10,458
5
76
Yeah, the RX 480 will almost certainly be faster than the 1060 in newer games - and maybe by a lot. But the 1060 is still more power efficient by a good margin.
If it's certain that the RX 480 will be faster in newer games, why do you assume the 1060 is more power efficient?

Efficiency for a GPU is performance-relative. Take Forza or Doom, for example: rig A uses 200W, rig B uses 230W, and rig B is 25% faster. Rig B is therefore more efficient.

As an example:
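Working through that arithmetic in a quick sketch (the absolute performance numbers are made up; only the 25% gap and the two wattages come from the post):

```python
# Perf-per-watt comparison: rig B draws more power but is 25% faster.
perf_a = 100.0          # arbitrary baseline performance (e.g. fps)
perf_b = perf_a * 1.25  # rig B is 25% faster
watts_a, watts_b = 200.0, 230.0

eff_a = perf_a / watts_a  # 0.5 perf/W
eff_b = perf_b / watts_b  # ~0.543 perf/W

print(f"Rig A: {eff_a:.3f} perf/W, Rig B: {eff_b:.3f} perf/W")
print(f"Rig B is {eff_b / eff_a - 1:.1%} more efficient")
```

Despite drawing 15% more power, rig B comes out roughly 8.7% ahead in performance per watt, which is the sense in which it is "more efficient".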

 

boozzer

Golden Member
Jan 12, 2012
1,549
17
81
If it's certain that the RX 480 will be faster in newer games, why do you assume the 1060 is more power efficient?

Efficiency for a GPU is performance-relative. Take Forza or Doom, for example: rig A uses 200W, rig B uses 230W, and rig B is 25% faster. Rig B is therefore more efficient.

As an example:

damn, that is pretty much checkmate.
 

sirmo

Golden Member
Oct 10, 2011
1,011
374
136
If it's certain that the RX 480 will be faster in newer games, why do you assume the 1060 is more power efficient?

Efficiency for a GPU is performance-relative. Take Forza or Doom, for example: rig A uses 200W, rig B uses 230W, and rig B is 25% faster. Rig B is therefore more efficient.

As an example:

Been saying this, but it's hard to get through to people sometimes... efficiency is heavily workload-dependent. GCN is obviously designed for these low-level APIs, and despite having all the extra compute logic, the RX 480 is pretty efficient when properly utilized, like in that example.
 

biostud

Lifer
Feb 27, 2003
15,310
600
126
Going DX12-only makes zero sense. What about Win 7 & 8 users? Linux? OSX?

Go Vulkan and target them all. I don't know why these publishers don't see that; id Software has laid the foundation for them and shown how it's done. Wasting time on DX11 and DX12, then porting to OpenGL for Linux & OSX... ew.
Win 7 and Win 8 users should have updated to Win 10 when it was free.
 
