AMD & NVIDIA GPU VR Performance in Call of Starseed

Unreal123

Senior member
Jul 27, 2016
223
71
101
Since AMD currently has no new high-end GPUs to compete with NVIDIA's GTX 1000 series, you might think the top end is looking bleak in terms of competition. That does not mean, however, that you cannot have a great experience with new VR games, as we have shown here today in Call of Starseed. The fact of the matter is that most current VR titles are simply not that demanding in terms of GPU usage, though of course we are looking for demanding VR games in these reviews.



While the Radeon R9 Fury X was the slowest in terms of average GPU render times at the High preset, the VR gaming experience with the HTC Vive was the same across all our high-end GPUs. The R9 Fury X, GTX 980 Ti, GTX 1070, and GTX 1080 all produced render times well below our 11.1ms maximum, keeping our Vive headset out of Reprojection. We also had no issues with dropped frames while the Vive was running at its optimal 90 frames per second. With Starseed you can even push the game to its maximum in-game settings with all of these high-end cards to further enhance your gameplay, and that is an important point with this game.
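The 11.1ms ceiling the review uses follows directly from the Vive's 90Hz refresh rate. A quick illustrative check (numbers are the review's, the code is just a sketch):

```python
# Frame-time budget for a 90 Hz headset: the GPU must finish each frame
# inside one refresh interval, or the runtime falls back to Reprojection.
# The 11.1 ms threshold from the review is simply 1000 ms / 90 Hz.

def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available to render one frame at a given refresh rate."""
    return 1000.0 / refresh_hz

budget = frame_budget_ms(90)
print(f"{budget:.1f} ms")  # 11.1 ms per frame at 90 Hz

# A GPU averaging 9 ms of render time stays out of Reprojection;
# one averaging 12 ms misses the vsync window every frame.
for render_ms in (9.0, 12.0):
    print(render_ms, "ms ->", "OK" if render_ms <= budget else "Reprojection")
```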



Looking at the RX 480 and GTX 1060 at the High IQ preset, the performance of both is extremely solid as well. The RX 480 starts to slip into Reprojection just a bit, but not enough to truly impact our experience. Since you are likely not making a lot of snapping head-turns in this game, Reprojection is not truly an issue unless you find yourself extremely sensitive to it. The GTX 1060 is just as good as the GTX 1080 in this game at the High IQ level.
http://www.hardocp.com/article/2016...r_performance_in_call_starseed/6#.V6sxO62XolA
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
is it a gimpworks game?

The whole reason for [H] testing Starseed was actually to avoid Gameworks (and thus the appearance of impropriety).

With that being said though, I still personally think that these kinds of benchmarks are a bit pointless since they still don't test the most important thing: Motion-To-Photon latency

We probably won't really get to see proper VR benchmarking until a site gets their hands on a VRScore Trek unit.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Odd that they are still using the "Leaderboard" considering it's only two games now, and one of them ran awfully on AMD hardware and was a Gameworks VR title.

This game runs just as well on AMD hardware as on NVidia, but the conclusion page still manages to paint AMD in a horrible light because of the other game.
 

pj-

Senior member
May 5, 2015
501
278
136
The whole reason for [H] testing Starseed was actually to avoid Gameworks (and thus the appearance of impropriety).

With that being said though, I still personally think that these kinds of benchmarks are a bit pointless since they still don't test the most important thing: Motion-To-Photon latency

I don't think that's a particularly useful test. MTP latency is more the responsibility of the headset's runtime than of the GPU being used. For a given headset and runtime version, that latency would be pretty much the same for any game that maintains 90fps. Both Oculus and Vive are "good enough" that you don't notice what small latency there is. Both are said to be under 20ms and I think oculus has claimed to be < 10ms with ATW.

The latency of pressing a button on a controller and seeing how long it takes before there's a visible reaction on screen is different, and wouldn't really affect the VR experience unless it was horrendous.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
I don't think that's a particularly useful test. MTP latency is more the responsibility of the headset's runtime than of the GPU being used. For a given headset and runtime version, that latency would be pretty much the same for any game that maintains 90fps. Both Oculus and Vive are "good enough" that you don't notice what small latency there is. Both are said to be under 20ms and I think oculus has claimed to be < 10ms with ATW.

The latency of pressing a button on a controller and seeing how long it takes before there's a visible reaction on screen is different, and wouldn't really affect the VR experience unless it was horrendous.

MTP latency is a result of the headset's runtime, the GPU used, usage of AMD's/Nvidia's latency-centric features from their respective VR suites, and whatever other VR-centric features the game in question is using (ATW, reprojection, etc.). All of these add up to the final MTP latency.

For instance we know that due to some of Nvidia's earlier GPUs having poor pre-emption they couldn't make efficient use of ATW, and thus the effective MTP latency would be higher than otherwise. This isn't captured in the linked review, but should be captured with VRScore Trek.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
I don't think that's a particularly useful test.
I think Motion to Photon latency is an extremely useful test... I can't possibly imagine how that wouldn't be helpful. If you're looking at buying a new rig, and (these numbers aren't true - they're made up hypotheticals) nVidia + Vive = 75ms average latency, nVidia + Oculus = 50ms average latency, AMD + Vive = 35ms average latency, and AMD + Oculus = 65ms average latency, you're telling me that isn't useful?
 

pj-

Senior member
May 5, 2015
501
278
136
I think Motion to Photon latency is an extremely useful test... I can't possibly imagine how that wouldn't be helpful. If you're looking at buying a new rig, and (these numbers aren't true - they're made up hypotheticals) nVidia + Vive = 75ms average latency, nVidia + Oculus = 50ms average latency, AMD + Vive = 35ms average latency, and AMD + Oculus = 65ms average latency, you're telling me that isn't useful?

Like I said, the latency you're talking about is different. The makers of both PC headsets target 20ms MTP latency for head movement as a maximum. I haven't experienced higher latency, but 20ms has been accepted as the rough upper limit that is not noticeable to humans.

The latency of me pressing the trigger and my gun firing in game is disconnected from how quickly my physical head movements are reflected by what's displayed on the screens. They are not just games being rendered to screens that happen to be strapped to your face.

Oculus in particular uses ATW on every frame, even when games are running at 90fps. As a result its latency is extremely low, even if the GPU's full pipeline latency is high. The runtime reprojects the latest completed frame with new headset position/rotation data so what you see is as up to date as possible. Measuring the GPU's contribution to MTP latency would be basically impossible on a rift because of ATW.

As a vive owner I can tell you that missed frames and % of time spent in reprojection are far larger concerns to me than the degrees of imperceptible latency between GPUs.

MTP latency is a result of the headset's runtime, the GPU used, usage of AMD's/Nvidia's latency-centric features from their respective VR suites, and whatever other VR-centric features the game in question is using (ATW, reprojection, etc.). All of these add up to the final MTP latency.

For instance we know that due to some of Nvidia's earlier GPUs having poor pre-emption they couldn't make efficient use of ATW, and thus the effective MTP latency would be higher than otherwise. This isn't captured in the linked review, but should be captured with VRScore Trek.

Yes I'm sure there are differences, but who cares? These headsets would be constant vomit fests if the latencies weren't at or below acceptable levels. I've heard no complaints from people with 970s and 390xs (aside from thinking they're entitled to run games at the highest settings with 2x SS enabled), and with how sensitive people are to imperfections in VR, if there was a problem with latency we would know about it.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
Yes I'm sure there are differences, but who cares? These headsets would be constant vomit fests if the latencies weren't at or below acceptable levels. I've heard no complaints from people with 970s and 390xs (aside from thinking they're entitled to run games at the highest settings with 2x SS enabled), and with how sensitive people are to imperfections in VR, if there was a problem with latency we would know about it.

Who cares about differences? I would say that the ones vomiting would care quite a bit.

Sensitivity is extremely subjective, some can easily handle high latency, whilst others will not even be able to handle 20ms. And I have seen plenty of people with 970s, 390Xs and other GPUs complaining. In most cases I think the complaining is a problem with the game and not the hardware (developers are still learning how to get a handle on these things, and how to optimize for latency instead of throughput), but it is impossible to know for certain without a test setup like the one I linked.
 

pj-

Senior member
May 5, 2015
501
278
136
Who cares about differences? I would say that the ones vomiting would care quite a bit.

Sensitivity is extremely subjective, some can easily handle high latency, whilst others will not even be able to handle 20ms. And I have seen plenty of people with 970s, 390Xs and other GPUs complaining. In most cases I think the complaining is a problem with the game and not the hardware (developers are still learning how to get a handle on these things, and how to optimize for latency instead of throughput), but it is impossible to know for certain without a test setup like the one I linked.

But people aren't throwing up because of latency. Nausea in current VR games is 99%+ due to artificial locomotion or missed frames. I am sensitive to nausea in VR and have had no problems in any game except Project Cars which made me feel like shit for hours.

It may be possible but I have not heard of a single person getting sick in VR playing a room scale (e.g. non-artificial locomotion) game running at consistent 90fps. I check the vive and oculus reddits 5-10 times a day so if it was happening someone would probably complain (people complain about anything) and I would probably see. My first vive experience was at a microsoft store demo which used a gtx 970. I felt no difference between that and my home computer with a 980ti. I haven't tried it with VR yet, but I expect my Titan XP will provide the same experience as well, except with SS a bit higher.

Testing latency is useful in the sense that any new knowledge is good, and improvements by the headset makers may add subtly to the sense of "presence", but your original statement is completely wrong.

"these kind of benchmarks are a bit pointless since they still don't test the most important thing: Motion-To-Photon latency"

I think anyone who has used VR extensively would prioritize improvements in missed frames, reprojection, aliasing, tracking jitter, and black levels, over improvements to MTP latency.
 

cytg111

Lifer
Mar 17, 2008
25,970
15,426
136
I can't tell if those titles are sporting SMP (and AMD's equivalent)? Because if they ain't... then there is a whooole lot of potential performance lurking in the shadows, just one little patch away (which could effectively make 480-level cards the go-to and 1070+ super overpowered).
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
But people aren't throwing up because of latency. Nausea in current VR games is 99%+ due to artificial locomotion or missed frames. I am sensitive to nausea in VR and have had no problems in any game except Project Cars which made me feel like shit for hours.

It may be possible but I have not heard of a single person getting sick in VR playing a room scale (e.g. non-artificial locomotion) game running at consistent 90fps. I check the vive and oculus reddits 5-10 times a day so if it was happening someone would probably complain (people complain about anything) and I would probably see. My first vive experience was at a microsoft store demo which used a gtx 970. I felt no difference between that and my home computer with a 980ti. I haven't tried it with VR yet, but I expect my Titan XP will provide the same experience as well, except with SS a bit higher.

You're absolutely right that games with artificial locomotion (which would generally be most non-room scale games) are the ones most prone to nausea, but the reason they are most prone to nausea is exactly MTP latency.

With artificial locomotion you tend to move faster than in room-scale games, so any increase in MTP latency will be worse here. Imagine that you're playing a game where you are moving 0.5 meters per second (a room-scale game, where you move around leisurely), versus a game where you are moving at 50 meters per second (a driving game, for instance). With a latency of 30ms, the slow-moving game will be off by 1.5 cm, whereas the fast-moving game will be off by 150 cm, which is huge. Now, in driving games the movement is most often fairly predictable, so the game can in theory try to predict your future position instead of your current position and use that positional data when rendering the frame, but this is something developers specifically have to optimize for (hence my previous statement that developers are still getting the hang of VR and how best to optimize for latency).
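The arithmetic in the example above is simply speed multiplied by latency. A minimal sketch using the post's hypothetical speeds and 30ms latency figure:

```python
# Positional error introduced by motion-to-photon latency: the rendered
# position lags the true position by (speed * latency). Speeds and the
# 30 ms latency are the hypothetical values from the post above.

def position_error_cm(speed_m_per_s: float, latency_ms: float) -> float:
    """How far (in cm) the rendered position lags the true position."""
    return speed_m_per_s * (latency_ms / 1000.0) * 100.0

LATENCY_MS = 30.0
print(position_error_cm(0.5, LATENCY_MS))   # room-scale walking pace -> 1.5 cm
print(position_error_cm(50.0, LATENCY_MS))  # driving-game speed -> 150 cm
```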

And for the record the ones I have seen complaining have not been for room scale either, it has primarily been for "piloted" games (i.e. driving sims and the like).

Testing latency is useful in the sense that any new knowledge is good, and improvements by the headset makers may add subtly to the sense of "presence", but your original statement is completely wrong.

"these kind of benchmarks are a bit pointless since they still don't test the most important thing: Motion-To-Photon latency"

I think anyone who has used VR extensively would prioritize improvements in missed frames, reprojection, aliasing, tracking jitter, and black levels, over improvements to MTP latency.

Missed frames and reprojection would also be captured in a MTP latency test. Aliasing and black level is more of an IQ thing than a performance thing, so not really relevant in this context.
 

pj-

Senior member
May 5, 2015
501
278
136
You're absolutely right that games with artificial locomotion (which would generally be most non-room scale games) are the ones most prone to nausea, but the reason they are most prone to nausea is exactly MTP latency.

With artificial locomotion you tend to move faster than in room-scale games, so any increase in MTP latency will be worse here. Imagine that you're playing a game where you are moving 0.5 meters per second (a room-scale game, where you move around leisurely), versus a game where you are moving at 50 meters per second (a driving game, for instance). With a latency of 30ms, the slow-moving game will be off by 1.5 cm, whereas the fast-moving game will be off by 150 cm, which is huge. Now, in driving games the movement is most often fairly predictable, so the game can in theory try to predict your future position instead of your current position and use that positional data when rendering the frame, but this is something developers specifically have to optimize for (hence my previous statement that developers are still getting the hang of VR and how best to optimize for latency).

No, latency has nothing specific to do with sickness when using artificial locomotion. The discomfort is caused by a disconnect in what your brain is seeing (motion) and what your inner ear is feeling (no motion). A hypothetical 0ms latency headset would still make me nauseous in Project Cars kart racing because I am not feeling the turns, acceleration, and elevation changes that I am seeing. It is the same root cause of nausea as when I'm reading in a car. My brain is seeing no motion, but my body is feeling the slight swaying and bumps in the road.

Artificial locomotion will never be comfortable for everyone. There is research into how to mitigate the effects, but they are looking at stuff like FOV constriction and vestibular stimulation, not latency reduction.

Your example doesn't make sense. If you are in a VR driving game your body doesn't feel any of the motion you see, regardless of latency. Acceleration will feel just as unnatural with 10ms latency as it does with 50ms. Your brain doesn't feel connected to the game world so it doesn't matter if your car is 1.5m behind where it "should" be.

I don't have the technical knowledge to explain it, but I am certain that speed/direction/acceleration of artificial locomotion and how they relate to motion sickness are completely independent of MTP latency.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
No, latency has nothing specific to do with sickness when using artificial locomotion. The discomfort is caused by a disconnect in what your brain is seeing (motion) and what your inner ear is feeling (no motion). A hypothetical 0ms latency headset would still make me nauseous in Project Cars kart racing because I am not feeling the turns, acceleration, and elevation changes that I am seeing. It is the same root cause of nausea as when I'm reading in a car. My brain is seeing no motion, but my body is feeling the slight swaying and bumps in the road.

Artificial locomotion will never be comfortable for everyone. There is research into how to mitigate the effects, but they are looking at stuff like FOV constriction and vestibular stimulation, not latency reduction.

Your example doesn't make sense. If you are in a VR driving game your body doesn't feel any of the motion you see, regardless of latency. Acceleration will feel just as unnatural with 10ms latency as it does with 50ms. Your brain doesn't feel connected to the game world so it doesn't matter if your car is 1.5m behind where it "should" be.

I don't have the technical knowledge to explain it, but I am certain that speed/direction/acceleration of artificial locomotion and how they relate to motion sickness are completely independent of MTP latency.

First of all, it's incorrect to say that the disconnect between what you see and what your inner ear registers is a case of motion vs. no motion. Nausea is caused when both your eyes and your inner ear register motion but the motions don't match up. It is also possible to get nausea from seeing motion without feeling motion, but anyone who suffers from this would likely also get nausea from simply watching a movie shot in first person (something like Hardcore Henry, for instance), and as such would probably suffer a great deal in VR even with latency below 20ms.

The example I outlined with the car can also cause nausea, but it is admittedly not a particularly good example, since it is a bit more complex than what I described and there are a number of other factors at play. A better example would simply be a game in which you turn your head rapidly over and over (a fast-paced FPS, for instance). It is possible to turn your head at up to 1500 degrees per second, but more realistically you will probably max out at around half that most of the time. So with 30ms of latency you could turn your head by 22.5 degrees within the latency window, or about 20% of the horizontal FOV of these headsets. Having the image off by 20% is plenty to cause nausea. To be honest I can't think of any fast-paced FPS VR games out there currently off the top of my head, but the VR version of DOOM may very well fit the bill.
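The same kind of back-of-the-envelope math, this time for rotation. A sketch using the post's 750 deg/s and 30ms figures; the 110-degree horizontal FOV is an assumed round number for these headsets, not from the post:

```python
# Angular error from head rotation during the latency window:
# error = turn speed * latency. Turn speed (~750 deg/s) and latency
# (30 ms) come from the post; the 110-degree horizontal FOV is an
# assumption for illustration.

def angular_error_deg(turn_speed_dps: float, latency_ms: float) -> float:
    """Degrees the image lags behind the head during the latency window."""
    return turn_speed_dps * (latency_ms / 1000.0)

err = angular_error_deg(750.0, 30.0)
print(err)  # 22.5 degrees

HFOV_DEG = 110.0  # assumed horizontal FOV
print(f"{err / HFOV_DEG:.0%} of the horizontal FOV")  # ~20%
```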
 
Last edited:

pj-

Senior member
May 5, 2015
501
278
136
First of all, it's incorrect to say that the disconnect between what you see and what your inner ear registers is a case of motion vs. no motion. Nausea is caused when both your eyes and your inner ear register motion but the motions don't match up. It is also possible to get nausea from seeing motion without feeling motion, but anyone who suffers from this would likely also get nausea from simply watching a movie shot in first person (something like Hardcore Henry, for instance).

The example I outlined with the car can also cause nausea, but it is admittedly not a particularly good example, since it is a bit more complex than what I described and there are a number of other factors at play. A better example would simply be a game in which you turn your head rapidly over and over (a fast-paced FPS, for instance). It is possible to turn your head at up to 1500 degrees per second, but more realistically you will probably max out at around half that most of the time. So with 30ms of latency you could turn your head by 22.5 degrees within the latency window, or about 20% of the horizontal FOV of these headsets. Having the image off by 20% is plenty to cause nausea. To be honest I can't think of any fast-paced FPS VR games out there currently off the top of my head, but the VR version of DOOM may very well fit the bill.

There have been low level driver and runtime optimizations to poll the headset data at the latest possible moment before the frame begins rendering, and that position includes an estimation of where your head will be when the frame is eventually drawn on the displays.

It is not the typical sequence of:
1. poll input
2. do simulation
3. render frame on gpu
4. draw to display

With VR, the headset is polled at about step 2.999, which cuts out a lot of OS/driver/game latency.

In a system where there is 50ms latency between the start of step 3 and the end of step 4, the effective latency would still be basically 0ms when your head is in predictable motion. For example, if you are moving the headset in a straight line at 1m/s, the runtime can predict very accurately where it needs to place the virtual camera to line up with what your brain is expecting. I'm not sure how advanced the prediction algorithms are but they seem to be pretty good.
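The prediction idea above can be sketched in a few lines. This is a toy 1-D model, not any real SDK's API; the function name and the 50ms pipeline delay are illustrative:

```python
# Toy sketch of pose prediction ("poll at step 2.999"): rather than
# rendering with the pose sampled at frame start, the runtime extrapolates
# where the head will be at scanout time. Names are illustrative only.

def predict_pose(position: float, velocity: float, seconds_ahead: float) -> float:
    """Linear extrapolation of a 1-D head position to display time."""
    return position + velocity * seconds_ahead

# Head moving at 1 m/s; 50 ms elapse between render start and photons.
pos_at_render_start = 0.0
velocity = 1.0
display_delay = 0.050

predicted = predict_pose(pos_at_render_start, velocity, display_delay)
actual_at_display = pos_at_render_start + velocity * display_delay

# With perfectly predictable motion, prediction error is ~0 despite 50 ms
# of pipeline latency -- the "effectively 0ms" point made above.
print(abs(predicted - actual_at_display))  # 0.0
```

Real runtimes predict rotation as well as position and blend IMU with camera data, so the extrapolation is more sophisticated than this, but the principle is the same.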

John Carmack says an easy test for latency is to look at a horizontal edge in VR and roll your head from side to side. If the edge appears to tilt, there is too much latency in the system. He says the effect is minor at 50ms of latency, and practically imperceptible at 20ms and below.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
There have been low level driver and runtime optimizations to poll the headset data at the latest possible moment before the frame begins rendering, and that position includes an estimation of where your head will be when the frame is eventually drawn on the displays.

It is not the typical sequence of:
1. poll input
2. do simulation
3. render frame on gpu
4. draw to display

With VR, the headset is polled at about step 2.999, which cuts out a lot of OS/driver/game latency.

In a system where there is 50ms latency between the start of step 3 and the end of step 4, the effective latency would still be basically 0ms when your head is in predictable motion. For example, if you are moving the headset in a straight line at 1m/s, the runtime can predict very accurately where it needs to place the virtual camera to line up with what your brain is expecting. I'm not sure how advanced the prediction algorithms are but they seem to be pretty good.

John Carmack says an easy test for latency is to look at a horizontal edge in VR and roll your head from side to side. If the edge appears to tilt, there is too much latency in the system. He says the effect is minor at 50ms of latency, and practically imperceptible at 20ms and below.

There is no fixed time at which games call position data; it depends entirely on what the game engine is set up to do and when the application calls the ovr_GetPredictedDisplayTime function (for Oculus). Obviously the later the better, but either way it happens before rendering starts, and often before a lot of the draw calls are generated by the CPU. It is important to note, though, that position tracking can itself have significant latency (10ms or more for camera-based tracking, only about 1ms for IMU-based tracking; sensor fusion will mitigate this). Also, your sequence is missing a couple of steps, including but not limited to: USB transfer latency, scanout latency, and pixel switching latency.

You do not get 0ms latency when your head is in predictable motion, simply because motion extrapolation is never perfect and can only ever compensate for so much, which is why it remains important to minimise, and thus measure, the total latency.

Either way the whole point is that latency is far and away the most important performance metric for VR, and it is not just a question of the HMD, but is also highly influenced by the game and GPU. Everyone who works in VR agrees on this.
 

pj-

Senior member
May 5, 2015
501
278
136
It was important in the development of these headsets, but it's not an issue in day to day use because it is evidently good enough. People can't tell the difference in latency between the vive and rift. The oculus runtime uses synchronous timewarp after rendering every frame, so there is no way its latency is above like 15ms (I forget the exact number but I think it is lower than that even). If all vive games are > 15ms, and nobody is noticing or complaining, why is it so important to measure it? From the consumer perspective, if all the measurements are below the point at which it is a problem, what does it matter?

I have over 50 games and demos for my vive and not once have I noticed latency or gotten uneasy feelings after playing one of them (p cars excepted). I have however experienced some frame drops and falling back to reprojection in almost all of them.

I don't think any consumer of VR does or should care about MTP latency ahead of those two. Minimizing it may subtly improve the experience, but there are factors that much more obviously impact the experience and my stomach.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
It was important in the development of these headsets, but it's not an issue in day to day use because it is evidently good enough. People can't tell the difference in latency between the vive and rift.

Note that I'm not so much talking about the latency produced by the headsets themselves (which as you mentioned should be quite low for both), I'm talking about latency produced by the game and by the CPU/GPU. Since game and GPU performance is exactly what [H] is trying to measure (and not headset performance), latency produced by these would be extremely important.

The oculus runtime uses synchronous timewarp after rendering every frame, so there is no way its latency is above like 15ms (I forget the exact number but I think it is lower than that even).

Note that [H] is not using a Rift but rather a Vive, so ATW doesn't apply here. Even so, ATW has its limitations, which is why Oculus themselves only recommend using it as a safety net, not something to rely on constantly.

If all vive games are > 15ms, and nobody is noticing or complaining, why is it so important to measure it? From the consumer perspective, if all the measurements are below the point at which it is a problem, what does it matter?

The VR experience is not a binary thing where latency above 20ms results in a vomit-comet experience and latency below 20ms is utter perfection. So knowing the exact latency has clear value.

Also, your argument that nobody is complaining is a bit silly. Console gamers generally don't complain about framerate in their games even when it's often only somewhere around 30 FPS (only if the game consistently dips below 30 will you see people complain), but if you tried to convince people on this forum that anything higher than 30 FPS is pointless because console gamers don't complain about it, you would be laughed out of here.

I have over 50 games and demos for my vive and not once have I noticed latency or gotten uneasy feelings after playing one of them (p cars excepted). I have however experienced some frame drops and falling back to reprojection in almost all of them.

Frame drops and reprojection are a result of latency, basically the game not having a new frame ready within the 11ms window. But if GPU A is constantly reprojecting because it takes 12ms to finish a frame, whereas GPU B is constantly reprojecting because it takes 20ms, then you would absolutely want to know, since that indicates reprojection can be fixed for GPU A with just a light overclock or slightly lowered settings, whereas GPU B is kinda screwed.
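A quick sketch of why the GPU A vs. GPU B distinction matters in practice, using the hypothetical 12ms and 20ms render times from the post against the ~11.1ms 90Hz budget:

```python
# Both GPUs reproject, but the performance headroom needed to get back
# under budget differs enormously. Render times (12 ms, 20 ms) are the
# post's hypothetical figures; 11.1 ms is the 90 Hz frame budget.

BUDGET_MS = 1000.0 / 90.0  # ~11.1 ms

def speedup_needed(render_ms: float) -> float:
    """Fractional performance increase required to stop reprojecting."""
    return max(0.0, render_ms / BUDGET_MS - 1.0)

for name, render_ms in [("GPU A", 12.0), ("GPU B", 20.0)]:
    # GPU A needs only ~8% more performance (a light overclock);
    # GPU B needs ~80% more (a new card, realistically).
    print(f"{name}: needs ~{speedup_needed(render_ms):.0%} more performance")
```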

I don't think any consumer of VR does or should care about MTP latency ahead of those two. Minimizing it may subtly improve the experience, but there are factors that much more obviously impact the experience and my stomach.

Again, dropped frames and reprojection are a result of rendering latency exceeding a preset threshold (10ms for Vive), and since rendering latency is a subpart of MTP latency, it absolutely matters.
 

pj-

Senior member
May 5, 2015
501
278
136
Latency and throughput are not the same thing. You could have a game with 500ms latency that still hits 90fps with 0 frame drops.

If you're claiming that dropped frames and reprojection are related to latency, then aren't these existing tests indirectly measuring latency and therefore not as useless as you say?

Also, re: Oculus, I was talking about synchronous timewarp, which is applied to every frame that completes on time to give the lowest headset latency possible. ATW is for when a frame takes too long and the runtime needs to give the headset a positionally updated frame to prevent judder.

And yes, the average consumer is dumb. I think I have become pretty sensitive to these kinds of things since I got heavily into PC gaming. Playing console games on my TV is annoying because of the input lag. I can tell when games drop below 80 fps on my G-Sync monitor. Given how intense an experience VR is, I would expect people to be complaining about games feeling "floaty" or "laggy" if latency were a problem. You can get that sensation somewhat with reprojection, but I have not heard anyone talk about it when games aren't reprojecting.

People can and will measure the MTP latency with VR. I am just saying that based on my experience so far, the results will not matter to me. If my Titan XP for some reason has 30ms latency and my old 980ti has 15ms, I'm not going to switch back because the Titan is going to beat its ass in render time, reprojection and dropped frames.