[HardOCP] AMD and NVIDIA GPU Vive VR Performance in Raw Data

Page 2 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

xpea

Senior member
Feb 14, 2014
So, they ran numbers for a single GameWorks VR game that is in early access. We know [H] is out for AMD, but this basically reads like an nVidia PR statement.

How can anybody give an honest suggestion for what card to get based on a sample set of 1? I have no doubt that a 1080 is going to be faster than an RX 480, but it's obvious there is an issue with this game on AMD hardware when an RX 480 and a Fury get the exact same performance.
But when we talk about a single Vulkan game that is not yet optimized for a certain brand, some have no issue drawing a definitive conclusion :hmm:
 

Bacon1

Diamond Member
Feb 14, 2016
But when we talk about a single Vulkan game that is not yet optimized for a certain brand, some have no issue drawing a definitive conclusion :hmm:

Because an optimized game from known developers is rated the same as a game that just came out in alpha status two weeks ago?

And how do you guys keep forgetting that Nvidia has been working directly with id on Vulkan support? They had id up on stage to show off 1080 running Vulkan and id said they have been working daily with Nvidia.

Yet that's compared to an alpha game that uses GameWorks VR? :hmm:
 

Unreal123

Senior member
Jul 27, 2016
Because an optimized game from known developers is rated the same as a game that just came out in alpha status two weeks ago?

And how do you guys keep forgetting that Nvidia has been working directly with id on Vulkan support? They had id up on stage to show off 1080 running Vulkan and id said they have been working daily with Nvidia.

Yet that's compared to an alpha game that uses GameWorks VR? :hmm:

There are certain posters who are still posting Hitman 2016 DX12 benchmarks, even though it is an early access game. The developer even admitted it is an early access game, so why are review sites and some people still posting that beta game's benchmarks? That is called a double standard.

Even Total War: Warhammer's DX12 is at the beta stage, and still some posters regularly post that result.
 

Bacon1

Diamond Member
Feb 14, 2016
There are certain posters who are still posting Hitman 2016 DX12 benchmarks, even though it is an early access game. The developer even admitted it is an early access game, so why are review sites and some people still posting that beta game's benchmarks? That is called a double standard.

Even Total War: Warhammer's DX12 is at the beta stage, and still some posters regularly post that result to fulfill their agenda.

Hitman isn't "Early Access", it is Episodic.


You have the option to use DX11 instead of DX12 if it's slower.

This game literally came out two weeks ago in early access as a Nvidia Gameworks showcase.


http://www.geforce.com/whats-new/ar...sed-performance-accelerated-by-nvidia-vrworks

It's not DX12, Vulkan, or anything like that; it's UE4.
 

Unreal123

Senior member
Jul 27, 2016
Hitman isn't "Early Access", it is Episodic.


You have the option to use DX11 instead of DX12 if it's slower.

This game literally came out two weeks ago in early access as a Nvidia Gameworks showcase.


http://www.geforce.com/whats-new/ar...sed-performance-accelerated-by-nvidia-vrworks

It's not DX12, Vulkan, or anything like that; it's UE4.
Then why do you not post Dota 2 and The Talos Principle Vulkan benchmarks?

I never saw people posting or discussing them, because AMD is not winning those benchmarks.
 

IllogicalGlory

Senior member
Mar 8, 2013
Because no one in this circle cares about those games? Dota 2 can be run on anything and TP's Vulkan implementation seems to be experimental and pointless. DX11 is the API to use. AMD appears to be winning there too. Barely anyone even bothered to bench them.

Doom got positive reception, has lots of players and is technically ambitious. Its implementation of Vulkan is great too, as long as your card can handle it. Why wouldn't there be a lot of discussion?

Hitman isn't early access either, and it seems to run pretty nicely, at least it does now. People really like the latest episode.
 

Bacon1

Diamond Member
Feb 14, 2016
Then why do you not post Dota 2 and The Talos Principle Vulkan benchmarks?

I never saw people posting or discussing them, because AMD is not winning those benchmarks.

Where are up-to-date DOTA 2 and Talos benchmarks? Not ones from when Vulkan support first came out and was 20-30% slower than DX11 for both AMD and Nvidia.

It's worthless posting benchmarks when there is nothing but performance decreases from using low-level APIs.
 

Gikaseixas

Platinum Member
Jul 1, 2004
http://www.geforce.com/whats-new/ar...sed-performance-accelerated-by-nvidia-vrworks



http://www.pcgamesn.com/nvidia-show...st-person-shooter-raw-data-unleashed-on-steam

The only thing I can conclude from this thread is that GameWorks™ seems to be working as intended...

Seems like you will only get a worthwhile player experience on a GTX 1080/1070 :thumbsup: (I try to avoid recommending legacy hardware)

Time to upgrade those 1060's again :)

But this could just be intended to show how things are right now, not necessarily to advertise Nvidia. Just my 2 cents
 

antihelten

Golden Member
Feb 2, 2012
Then why do you not post Dota 2 and The Talos Principle Vulkan benchmarks?

I never saw people posting or discussing them, because AMD is not winning those benchmarks.

It's really not that complicated.

It's not about whether or not a game's DX12/Vulkan implementation is alpha/beta/early access or whatever; it's about whether or not the implementation provides a performance boost relative to DX11. No gamer is going to willingly use an API if it provides worse performance than the alternative.

So it basically goes like this:

AMD :
Games where DX12/Vulkan is better:
  • Ashes of the Singularity
  • Rise of the Tomb Raider (with latest patch)
  • Hitman
  • DOOM
  • Total War: Warhammer
Games where DX12/Vulkan performance is a wash or depends upon setup:
  • DOTA 2
Games where DX12/Vulkan is worse:
  • Talos Principle

Nvidia:
Games where DX12/Vulkan is better:

Games where DX12/Vulkan performance is a wash or depends upon setup:
  • DOOM
  • Hitman
  • Ashes of the Singularity
  • Rise of the Tomb Raider
  • Total War: Warhammer

Games where DX12/Vulkan is worse:
  • Talos Principle
  • DOTA 2

Games where DX12/Vulkan is relevant for both brands by default (since there isn't any alternative API available):
  • Gears of War
  • Quantum Break
  • Forza 6 Apex

PS. The above is based upon my memory of benches of the mentioned games. Newer patches/drivers may have changed things.
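The breakdown above is effectively a small lookup table. Here is a rough sketch in Python, purely to make the post's logic concrete; the game names and verdicts are simply transcribed from the lists above, and `worth_enabling` is a hypothetical helper encoding the "no gamer willingly uses a slower API" rule:

```python
# Per-vendor verdict on whether a game's DX12/Vulkan path beats its DX11 path.
# "wash" means roughly equal, or dependent upon the setup.
VERDICTS = {
    "AMD": {
        "Ashes of the Singularity": "better",
        "Rise of the Tomb Raider": "better",  # with latest patch
        "Hitman": "better",
        "DOOM": "better",
        "Total War: Warhammer": "better",
        "DOTA 2": "wash",
        "Talos Principle": "worse",
    },
    "Nvidia": {
        "DOOM": "wash",
        "Hitman": "wash",
        "Ashes of the Singularity": "wash",
        "Rise of the Tomb Raider": "wash",
        "Total War: Warhammer": "wash",
        "Talos Principle": "worse",
        "DOTA 2": "worse",
    },
}

# Titles with no DX11 alternative, so the low-level API is relevant by default.
DX12_ONLY = {"Gears of War", "Quantum Break", "Forza 6 Apex"}

def worth_enabling(vendor: str, game: str) -> bool:
    """Would a gamer plausibly pick the low-level API, per the post's logic?"""
    if game in DX12_ONLY:
        return True  # there is no other API to fall back to
    return VERDICTS[vendor].get(game) == "better"
```

Under this table, `worth_enabling("AMD", "DOOM")` is `True` while `worth_enabling("Nvidia", "DOOM")` is `False`; as the PS notes, newer patches/drivers may have changed the underlying verdicts.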
 

Stuka87

Diamond Member
Dec 10, 2010
But when we talk about a single Vulkan game that is not yet optimized for a certain brand, some have no issue drawing a definitive conclusion :hmm:

I don't think I have ever said that.

But not really comparable anyway. Doom is a released game where both GPU makers have been working with id for the last 7 months. It is fully optimized for both GPU makers, as I doubt it would take nVidia longer to optimize than AMD when they both have full dev access. And Doom is sponsored by nobody.

The game mentioned here is not only alpha, but it's a GameWorks game, which means the dev is not allowed to work with AMD.
 
Mar 10, 2006
I don't think I have ever said that.

But not really comparable anyway. Doom is a released game where both GPU makers have been working with id for the last 7 months. It is fully optimized for both GPU makers, as I doubt it would take nVidia longer to optimize than AMD when they both have full dev access. And Doom is sponsored by nobody.

That's the impression AMD would like us to get :)

Seriously, the Vulkan renderer for that game really seems like it's chock-full of AMD-specific optimizations (the devs even implied as much in the RX 480 promo video).
 

bystander36

Diamond Member
Apr 1, 2013
It's really not that complicated.

It's not about whether or not a game's DX12/Vulkan implementation is alpha/beta/early access or whatever; it's about whether or not the implementation provides a performance boost relative to DX11. No gamer is going to willingly use an API if it provides worse performance than the alternative.

Is performance the only difference between the APIs in these games, or have any of them added new features?
 

antihelten

Golden Member
Feb 2, 2012
Is performance the only difference between the APIs in these games, or have any of them added new features?

None of them have added features to my knowledge, and if they did I don't see why they couldn't also be added to the DX11 version (in other words, new features wouldn't be a result of switching to a new API in and of itself, but simply a result of the developers trying to make the new version stand out more).
 

Yakk

Golden Member
May 28, 2016
Yeah, it also explains the weird [H] recommendation of a VR card based on an early access title. Kyle really has a grudge against AMD, and IMO the site has gone off the deep end because of it. Leading up to the launch of the 480 he made some crazy claims, things a professional reviewer should just know better than (like 14nm supposedly being 30% faster than 16nm). Which is very unfortunate, because I actually prefer their review style over other sites hands down.

I stopped going there a while back when things stopped making any kind of logical sense. Looks like they are continuing on that road. That VR blog is just shameless promotion IMHO. Too bad, but it is what it is.

Just move on to other sites.
 

USER8000

Golden Member
Jun 23, 2012
That's the impression AMD would like us to get :)

Seriously, the Vulkan renderer for that game really seems like it's chock-full of AMD-specific optimizations (the devs even implied as much in the RX 480 promo video).

Funny that - Nvidia were boasting about the GTX1080 running Doom under Vulkan before AMD did:

http://www.geforce.co.uk/whats-new/...-gtx-1080-run-doom-at-up-200-fps-using-vulkan

Nvidia on May 8th 2016 said:
The stunning GeForce GTX 1080 is our fastest and most efficient graphics card to date. During our livestream announcement of the GeForce GTX 1080, the game-changing graphics card was shown running games at max detail levels at 2560x1440 at high framerates, powering new technologies that will make your games better, more beautiful and more immersive, and doing a whole lot more. Now, we're going to show the GeForce GTX 1080 running something new that's never been seen before.
At a post-announcement event for technology journalists, id Software made a surprise appearance to demo the new Doom. Running on a GeForce GTX 1080, the framerate was a solid 60 with every setting turned to Ultra. It was then revealed that the demo was in fact running on Vulkan, a cross-platform low-level graphics API, making Doom the first AAA game to use the exciting new technology. But that wasn't all: for their big finale, id Software announced that the framerate was being artificially capped to 60 -- switching the cap off, Doom ran at up to 200 FPS during the multi-minute live demo!
By eliminating most of the overheads present on DirectX 11, and by utilising new technologies and techniques, Vulkan can greatly accelerate framerates in games. And with the GeForce GTX 1080 you can push those framerates higher and higher.
For a chapterised re-run of the GeForce GTX 1080's unveiling check out our YouTube channel, and for further info about the announcements we made take a look at our articles. For more info on the GeForce GTX 1080's release later this month, stay locked to GeForce.co.uk.

It seems Nvidia had no issues with the performance of their cards under Vulkan even three months ago.
 

eddman

Senior member
Dec 28, 2010
it's a GameWorks game, which means the dev is not allowed to work with AMD.

I see this claim quite often. Was it ever proven? Probably only a part of the game's source code is GameWorks, not all of it, like all the other games that have GameWorks features in them. Can Nvidia prevent the developer from optimizing the rest of the code for AMD?

It's really not that complicated.

Last information was that only RotTR is optimized for Pascal's async. Has this changed?

As for this VR benchmark: one game, which is also in early access, is not enough to draw any meaningful conclusions. It's just a single data point.
 

antihelten

Golden Member
Feb 2, 2012
Last information was that only RotTR is optimized for Pascal's async. Has this changed?

If by "only optimized for Pascal" you mean that Pascal is the only Nvidia architecture it is optimized for*, then no, this doesn't appear to have changed; Maxwell/Kepler still don't support async compute in RotTR to my knowledge.

*It is of course also optimized for all of AMD's GCN architectures except for GCN 1.0
 

Unreal123

Senior member
Jul 27, 2016
I see this claim quite often. Was it ever proven? Probably only a part of the game's source code is GameWorks, not all of it, like all the other games that have GameWorks features in them. Can Nvidia prevent the developer from optimizing the rest of the code for AMD?



Last information was that only RotTR is optimized for Pascal's async. Has this changed?

As for this VR benchmark: one game, which is also in early access, is not enough to draw any meaningful conclusions. It's just a single data point.
This is a proper VR benchmark of a few games, and Nvidia is ahead.

https://www.youtube.com/watch?v=I8vOLoUwcyc
 

Yakk

Golden Member
May 28, 2016
I see this claim quite often. Was it ever proven? Probably only a part of the game's source code is GameWorks, not all of it, like all the other games that have GameWorks features in them. Can Nvidia prevent the developer from optimizing the rest of the code for AMD?

Yes, Nvidia can prevent developers from optimizing for AMD in different ways. Their most often used tactic is releasing a GameWorks code update close to the game's release date, which does not leave AMD time to reverse engineer a driver hack to bypass indirectly interfering code, even if the original game code was Radeon friendly.

As for the developer legal bindings, the only people who can really answer that question are under NDA for using the GameWorks libraries, and there are no "leaks" of these confidential documents.

Read into that what you will.
 

Bacon1

Diamond Member
Feb 14, 2016
If by "only optimized for Pascal" you mean that Pascal is the only Nvidia architecture it is optimized for*, then no, this doesn't appear to have changed; Maxwell/Kepler still don't support async compute in RotTR to my knowledge.

*It is of course also optimized for all of AMD's GCN architectures except for GCN 1.0

RotTR doesn't seem to use async compute on AMD hardware; there's about a 1% performance difference between enabled and disabled when toggling the registry key for it.
 

crisium

Platinum Member
Aug 19, 2001
Titling it a "VR Leaderboard" from a single title seems excessively try [H]ard, even by Kyle.
 

Stuka87

Diamond Member
Dec 10, 2010
I see this more than often. Was this ever proven? Probably only a part of the game's source code is gameworks, not all of it, like all the other games that have gameworks features in them. Can NVidia prevent the developer from optimizing the rest of the code for AMD?

nVidia has different agreements with different companies, and we only know what these are if the dev chooses to tell us.

The first dev to bring this to light was CD Projekt Red, the maker of The Witcher 3. They were explicitly forbidden from working with AMD, so AMD was required to reverse engineer in order to get performance gains.
 

antihelten

Golden Member
Feb 2, 2012
RotTR doesn't seem to use async compute on AMD hardware; there's about a 1% performance difference between enabled and disabled when toggling the registry key for it.

That doesn't necessarily tell you anything about whether or not it's actually being used. It's perfectly possible to use async compute and not gain any performance increase.

Anyway, to get back on topic: has anyone seen whether any sites out there have received a VRScore kit (they should have gone out in June)? Benchmarking GPUs and HMDs in VR doesn't really make sense until we have something like that.
 

exar333

Diamond Member
Feb 7, 2004
Let's stay on topic? It gets really old here constantly hearing 'AMD is a genius for DX12 and it created it' as well as 'NV is optimized in this game so they must have paid someone off'. :(
 

eddman

Senior member
Dec 28, 2010
If by "only optimized for Pascal" you mean that Pascal is the only Nvidia architecture it is optimized for*, then no, this doesn't appear to have changed; Maxwell/Kepler still don't support async compute in RotTR to my knowledge.

*It is of course also optimized for all of AMD's GCN architectures except for GCN 1.0

No, I don't mean "only for Pascal". I mean that the majority of the current DX12 games do not even utilize Pascal's async, besides RotTR. They basically don't support it (at least that's what Ryan wrote), meaning that they are not representative of its true DX12 performance. Kepler and Maxwell cannot do async anyway, not in a way that would increase performance.

Yes, Nvidia can prevent developers from optimizing for AMD in different ways. Their most often used tactic is releasing a GameWorks code update close to the game's release date, which does not leave AMD time to reverse engineer a driver hack to bypass indirectly interfering code, even if the original game code was Radeon friendly.

As for the developer legal bindings, the only people who can really answer that question are under NDA for using the GameWorks libraries, and there are no "leaks" of these confidential documents.

Read into that what you will.

But GameWorks only applies to certain effects and features in the game that can be disabled, not the entire source code of the game. Isn't that so?

If that's the case, then Nvidia shouldn't be able to prevent the developer from optimizing the non-GameWorks code for AMD.

nVidia has different agreements with different companies, and we only know what these are if the dev chooses to tell us.

The first dev to bring this to light was CD Projekt Red, the maker of The Witcher 3. They were explicitly forbidden from working with AMD, so AMD was required to reverse engineer in order to get performance gains.

Can they legally do that? Do we know for sure that CD Projekt Red agreed to such an agreement? Are there any sources?

Again, I'm talking about the non-GameWorks part of the source code. GameWorks code is obviously locked to Nvidia only.
 