
computerbase: Ashes of the Singularity Beta 1 DirectX 12 Benchmarks


Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
I'm not into the software side of things, so I don't really know how it works.

Reading your statement makes me wonder what you're saying.

I tried to decipher your statement and came to the following conclusion: it's not fair to implement it that way if AMD GPUs benefit from it and Nvidia GPUs don't. Correct?
That is correct. But what counts as unfair? You have consoles with GCN, and PCs with GCN, which can switch contexts properly, and you have Apple on the other side of the world using the GCN architecture as well. To overcome the context-switching problem, you have to add specific lines of code just for Nvidia hardware to the application; otherwise, performance will always tank on Nvidia hardware.

It is simply easier to code multiplatform apps with GCN in mind first, and Nvidia later.
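As a rough illustration (my own sketch, not Oxide's actual code), such a vendor-specific path in a D3D12 title often starts with nothing more than a DXGI vendor-ID check; the helper name and the separate-compute-queue toggle below are hypothetical:

```cpp
#include <dxgi.h>

// Well-known PCI vendor IDs.
constexpr UINT kVendorAMD    = 0x1002;
constexpr UINT kVendorNvidia = 0x10DE;

// Hypothetical helper: decide whether to take the GCN-oriented path (heavy
// use of a separate compute queue) or a conservative single-queue path.
bool UseSeparateComputeQueue(IDXGIAdapter1* adapter)
{
    DXGI_ADAPTER_DESC1 desc = {};
    if (FAILED(adapter->GetDesc1(&desc)))
        return false;                    // unknown adapter: stay conservative

    switch (desc.VendorId)
    {
    case kVendorAMD:    return true;     // GCN's ACEs run compute alongside graphics
    case kVendorNvidia: return false;    // avoid the context-switch penalty described above
    default:            return false;
    }
}
```

The engine can then route compute work through a dedicated queue on hardware that benefits from it and keep everything on the graphics queue elsewhere.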
 
Last edited:

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
No, it is the job of the developer to implement it for every piece of hardware in a way that works.

Oxide demanded a low-level API and more access to the GPU. Blaming the hardware vendor when something doesn't work is just an excuse.

Guess we won't be hearing any of the "it's AMD's lack of support or drivers" statements from you anymore.

It works, doesn't it? Maybe just not the way you want it to, though.

Why isn't your philosophy implemented in GameWorks titles? I guess it is, since those work the way you want them to. Guess it's a visual fidelity thing.

Seems to me the best way to do it is to just use the DX12 implementation as it's written/standardized and let the cards fall where they may.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
DX12 is a low-level API. The developer has much more control. He has much more responsibility, too.

But I guess people believe that DX12 is just another DX version, and that you can still blame drivers for the performance.
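To make the "more responsibility" point concrete, here is a minimal sketch (mine, not from the thread) of one duty DX12 moves from the driver to the application: explicit resource state transitions that a DX11 driver tracked automatically. The function name is illustrative.

```cpp
#include <d3d12.h>

// In DX11 the driver tracked resource hazards behind the scenes; in DX12 the
// application must record explicit barriers, e.g. before sampling a texture
// that was just used as a render target.
void TransitionToShaderResource(ID3D12GraphicsCommandList* cmdList,
                                ID3D12Resource* texture)
{
    D3D12_RESOURCE_BARRIER barrier = {};
    barrier.Type                   = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
    barrier.Transition.pResource   = texture;
    barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
    barrier.Transition.StateBefore = D3D12_RESOURCE_STATE_RENDER_TARGET;
    barrier.Transition.StateAfter  = D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE;
    cmdList->ResourceBarrier(1, &barrier);
}
```

Forget a barrier like this and the result is undefined behavior that no driver will quietly fix for you.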
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
That is correct. But what counts as unfair? You have consoles with GCN, and PCs with GCN, which can switch contexts properly, and you have Apple on the other side of the world using the GCN architecture as well. To overcome the context-switching problem, you have to add specific lines of code just for Nvidia hardware to the application; otherwise, performance will always tank on Nvidia hardware.

It is simply easier to code multiplatform apps with GCN in mind first, and Nvidia later.
That reads like a scary long-term strategy by AMD.
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
That reads like a scary long-term strategy by AMD.

Not really. For Nvidia it would be enough to add a hardware scheduler and a second asynchronous compute engine. It would make a GIGANTIC difference.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
DX12 is a low-level API. The developer has much more control. He has much more responsibility, too.

But I guess people believe that DX12 is just another DX version, and that you can still blame drivers for the performance.

I'm still lost by your lack of thinking it through.

You're saying that only in DX12 does the developer need to implement things in a way that makes performance equal?

Are you saying Nvidia's drivers suck in DX12?

Or is it that current Nvidia designs weren't really made with DX12 in mind?
 

Mercennarius

Senior member
Oct 28, 2015
466
84
91
I don't think it has anything to do with Nvidia's drivers. I think two things are at play:

1.) AMD has never had DX11 drivers as good as Nvidia's.
2.) AMD's cards, at the hardware/architecture level, are better suited to DX12.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I can't believe people are still getting hot under the collar about a single game that is:

  1. in beta,
  2. pre-performance-optimisation,
  3. on a brand new API (no DX12 title is currently available),
  4. a new API being implemented on older hardware*

* Older hardware with respect to DX12. I suspect this will be controversial, but current hardware is primarily DX11 hardware. Furthermore, we do not know what the developers, AMD, Nvidia, and Microsoft are up to. You guys need to keep in mind how minuscule the amount of available information is.

It is the same conversation/argument we have had on this board with each new DX release. By the time DX12 becomes mainstream, all of these tail-end DX11 cards will be a distant memory. Nobody is going to care whether a Fury beats a 980 in two years.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
It is the same conversation/argument we have had on this board with each new DX release. By the time DX12 becomes mainstream, all of these tail-end DX11 cards will be a distant memory. Nobody is going to care whether a Fury beats a 980 in two years.

Everyone will care in a few months, when the first DX12 games start to hit the market. And most of the people who bought those 390/Furys and GTX 970/980s will still care next year (2017) as well; not everyone changes their GPU every 12 months.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Everyone will care in a few months, when the first DX12 games start to hit the market. And most of the people who bought those 390/Furys and GTX 970/980s will still care next year (2017) as well; not everyone changes their GPU every 12 months.

This is the same argument people used with every DX release. A handful of half-assed implementations come out in year one; mainstream adoption is usually 2-3 years after that. By the time DX12 actually matters across a broad spectrum of games, nobody will care, because we'll be two generations into DX12 cards.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Not really. For Nvidia it would be enough to add a hardware scheduler and a second asynchronous compute engine. It would make a GIGANTIC difference.
But it will mean every NV GPU out right now = trash once DX12 becomes mainstream. I guess it depends on the adoption rate of Win 10.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
This is the same argument people used with every DX release. A handful of half-assed implementations come out in year one; mainstream adoption is usually 2-3 years after that. By the time DX12 actually matters across a broad spectrum of games, nobody will care, because we'll be two generations into DX12 cards.

Sorry, but people are buying R9 390/Fury and GTX 970/980 cards TODAY (Q1 2016), and others will buy those GPUs in Q2 2016. Many of them will keep those cards for 2-3 years; that means that in 2018-2019, all those people will care how those cards perform.
Also, I don't expect much more performance from 14/16nm cards in 2016 at the same price as current products.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Sorry, but people are buying R9 390/Fury and GTX 970/980 cards TODAY (Q1 2016), and others will buy those GPUs in Q2 2016. Many of them will keep those cards for 2-3 years; that means that in 2018-2019, all those people will care how those cards perform.
Also, I don't expect much more performance from 14/16nm cards in 2016 at the same price as current products.

I am sure they are happy to know your concern for their purchase. The argument is tired, boring, and predictable: faux concern for cards and how they will perform in 2-3 years, when nobody buying high-end gear would stand for the performance of said card in 2-3 years even if DX12 had never happened.

I, on the other hand, expect a good boost in performance from the first 14nm cards, and I am waiting for them so I can upgrade from my 770 to play a whole slew of DX10/11 games.
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
I am sure they are happy to know your concern for their purchase. The argument is tired, boring, and predictable: faux concern for cards and how they will perform in 2-3 years, when nobody buying high-end gear would stand for the performance of said card in 2-3 years even if DX12 had never happened.

I expect the R9 290s I bought in the past half year to still be in use five years from now. The friends whose cards they replaced/will replace had 430s in them, and I expect the 290s to last much better than 970s would have. Sometimes you buy with the intention of turning money into framerate as efficiently as possible and riding it out as long as you can.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I expect the R9 290s I bought in the past half year to still be in use five years from now. The friends whose cards they replaced/will replace had 430s in them, and I expect the 290s to last much better than 970s would have. Sometimes you buy with the intention of turning money into framerate as efficiently as possible and riding it out as long as you can.

And what kind of performance do you expect out of those 290s in five years, in both DX11 and DX12? I suspect that, regardless of what you believe, your 290 is going to be terrible in most higher-end games released in 2021. So you may get a few more FPS than a 970. Great!
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
I am sure they are happy to know your concern for their purchase. The argument is tired, boring, and predictable: faux concern for cards and how they will perform in 2-3 years, when nobody buying high-end gear would stand for the performance of said card in 2-3 years even if DX12 had never happened.

I, on the other hand, expect a good boost in performance from the first 14nm cards, and I am waiting for them so I can upgrade from my 770 to play a whole slew of DX10/11 games.

On your first paragraph, I bolded the relevant thing you said. It is predictable that the 390/Fury will age better than the 970/980.

On your second paragraph, do you not see the irony? You have the 770, a card that is now the ubiquitous epitome of poor aging. You should be concerned about cards aging poorly in 2-3 years, because your card is doing exactly that.

I'm not going to criticize your purchase, because it could have made sense at the time for a number of reasons (AMD mining inflation making the 280X untenable, for example). But as the owner of a 2-3 year old card that aged poorly, you should be among the voices pointing to the importance of seeking cards that will age better, based on common-sense past trends and future predictions.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I am sure they are happy to know your concern for their purchase. The argument is tired, boring, and predictable: faux concern for cards and how they will perform in 2-3 years, when nobody buying high-end gear would stand for the performance of said card in 2-3 years even if DX12 had never happened.

I, on the other hand, expect a good boost in performance from the first 14nm cards, and I am waiting for them so I can upgrade from my 770 to play a whole slew of DX10/11 games.

The funny thing is that my HD 7950 1GHz, which was purchased before your GTX 770 even launched, was cheaper and will be faster in 2016-2017. Same with the R9 290/390 vs. the GTX 970/980.

I was talking with two friends yesterday; one has a GTX 670 2GB and the other a GTX 680 2GB. The 680 2GB was 150-200 Euro more expensive than my HD 7950 1GHz on its date of purchase, and it is now slower. It will be even worse in a few months with DX12 games, yet he will still be keeping it in 2016 and suffering with low performance in new games.
The one with the GTX 670 is already talking about replacing his card with a new one. Although my HD 7950 is still very capable for 1080p gaming, I have just purchased a new R9 390 because of my 144Hz monitor, and I will give the HD 7950 to my brother-in-law to replace his HD 6950 2GB.
 

Mercennarius

Senior member
Oct 28, 2015
466
84
91
But it will mean every NV GPU out right now = trash once DX12 becomes mainstream. I guess it depends on the adoption rate of Win 10.

Windows 10's adoption rate is significant. It has already passed Windows 8/8.1 in total systems and is expected to pass Windows 7 within two years.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
On your first paragraph, I bolded the relevant thing you said. It is predictable that the 390/Fury will age better than the 970/980.

On your second paragraph, do you not see the irony? You have the 770, a card that is now the ubiquitous epitome of poor aging. You should be concerned about cards aging poorly in 2-3 years, because your card is doing exactly that.

I'm not going to criticize your purchase, because it could have made sense at the time for a number of reasons (AMD mining inflation making the 280X untenable, for example). But as the owner of a 2-3 year old card that aged poorly, you should be among the voices pointing to the importance of seeking cards that will age better, based on common-sense past trends and future predictions.

You may believe that is the relevant thing for you to point out; it isn't relevant to the conversation we have every time a new iteration of DX comes out, regardless of your opinion of my purchasing decisions. Mainstream DX games take years to come out, and by that time tail-end cards are in the dustbin of history. Whether I bought a 280X or a 770 two years ago would not stop me from plopping down money on a new 14nm card when they come out this year. And in 2-3 years, when DX12 games become more mainstream, not many will care either way how these cards perform. But feel free to start a thread on how cards released in 2015 perform in mainstream DX12 games in 2018-19. It may generate some interest.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
The funny thing is that my HD 7950 1GHz, which was purchased before your GTX 770 even launched, was cheaper and will be faster in 2016-2017. Same with the R9 290/390 vs. the GTX 970/980.

I was talking with two friends yesterday; one has a GTX 670 2GB and the other a GTX 680 2GB. The 680 2GB was 150-200 Euro more expensive than my HD 7950 1GHz on its date of purchase, and it is now slower. It will be even worse in a few months with DX12 games, yet he will still be keeping it in 2016 and suffering with low performance in new games.
The one with the GTX 670 is already talking about replacing his card with a new one. Although my HD 7950 is still very capable for 1080p gaming, I have just purchased a new R9 390 because of my 144Hz monitor, and I will give the HD 7950 to my brother-in-law to replace his HD 6950 2GB.

OK? I will have whatever 14nm card fits my needs while you play with that 7950. I think we are getting beyond my point and instead waving our e-peens.
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
But it will mean every NV GPU out right now = trash once DX12 becomes mainstream. I guess it depends on the adoption rate of Win 10.

I am sorry for saying this, but... we have known that ever since we understood the architectures and how they manage a low-level, multithreaded API like DX12. Like I said in a previous post, for Nvidia it would change everything if they added a second ACE and an HWS to their GPUs.

The problem is this: both of those hardware features are used by HSA 2.0, and we know very well that Nvidia will support that only through CUDA.
The real question right now is: what does that mean at the hardware level? From my perspective, it looks like another layer of abstraction: a simple architecture abstracted through added software; proprietary software, at that.
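For reference, here is roughly what the work those ACEs pick up looks like from the API side: a minimal, hypothetical D3D12 sketch of creating a dedicated compute queue and fencing the graphics queue against it. The helper names are mine; the device and fence are assumed to come from the usual renderer setup.

```cpp
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Work submitted on a D3D12_COMMAND_LIST_TYPE_COMPUTE queue is what GCN's
// asynchronous compute engines can execute alongside the graphics queue.
ComPtr<ID3D12CommandQueue> CreateComputeQueue(ID3D12Device* device)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // separate queue type for compute
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
    return queue;
}

// Make the graphics queue wait for async compute results on the GPU,
// without stalling the CPU.
void SyncGraphicsToCompute(ID3D12CommandQueue* gfxQueue,
                           ID3D12CommandQueue* computeQueue,
                           ID3D12Fence* fence, UINT64 fenceValue)
{
    computeQueue->Signal(fence, fenceValue);  // compute marks its work done
    gfxQueue->Wait(fence, fenceValue);        // graphics consumes it safely
}
```

How efficiently the hardware overlaps those two queues is exactly where the architectures differ.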
 
Last edited:

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
Whether I bought a 280X or a 770 two years ago would not stop me from plopping down money on a new 14nm card when they come out this year. And in 2-3 years, when DX12 games become more mainstream, not many will care either way how these cards perform.

You still haven't acknowledged the irony. It may even be obtuse enough to warrant the label of hypocrisy.

Faux concern for cards and how they will perform in 2-3 years, when nobody buying high-end gear would stand for the performance of said card in 2-3 years

You bought high-end gear 2-3 years ago and are standing for its performance right now.

But feel free to start a thread on how cards released in 2015 perform in mainstream DX12 games in 2018-19. It may generate some interest.

Have you seen the multiple threads, with dozens of posts, about which cards have aged well or aged poorly, or the thread simply praising the 7970? Threads about cards 3-4 years old? I hope for your sake you were not being sarcastic.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
I am sorry for saying this, but... we have known that ever since we understood the architectures and how they manage a low-level, multithreaded API like DX12. Like I said in a previous post, for Nvidia it would change everything if they added a second ACE and an HWS to their GPUs.

GM200 is a better DX12 card than Fiji. GM204 beats Tonga without problems. AMD has no advantage in DX12.
Looks like AMD needs to catch up with them. D:

The problem is this: both of those hardware features are used by HSA 2.0, and we know very well that Nvidia will support that only through CUDA.
The real question right now is: what does that mean at the hardware level? From my perspective, it looks like another layer of abstraction: a simple architecture abstracted through added software; proprietary software, at that.
HSA? Isn't that the thing nobody cares about?!