
[David Kanter on Tech Report] - Nvidia VR preemption "possibly catastrophic".

Ha, you're absolutely right: https://www.youtube.com/watch?v=zFCxjXe3f1M

Oh and here is nVidia with timewarps: https://www.youtube.com/watch?v=uwGDg_SegDg

12ms latency. 🙄

So one guy out of the thousands using the new SDK, and one demo showing Nvidia can barely manage 12ms on a static, low-res image. Quality links there. I'm gonna go email Kanter ASAP and tell him how wrong he is.

He's obviously over-embellishing the point. Elite with a DK2 works great on my Nvidia hardware.

This is a case where most people haven't tried better, so they don't know what real stutter-free, low-latency VR is like. Elite on DK2 will look like Asteroids compared to real VR. There are a lot of black, still shots in Elite too - it will be a different case when planetary landings are available.
 
I am actually qualified to fly most aircraft up to small jets, so flying for me is not an issue; it's the loss of peripheral vision and the disconnect in audio-visual synchronisation that gets to me.

For me, it is latency. The more real the game/simulation, the lower the latency needs to be or I get nauseous.
 

So when you say stuff like this,

If people really understood how far ahead AMD is in VR they'd be stunned. There is even bigger news to come from AMD and Crytek. Nvidia doesn't have the hardware or software, it's like a 286 on DOS vs an i7 on Windows 10.

You are just talking out your .......

Well, that's great then. Good job!!
 
I watched the entire video with David Kanter. It's funny how someone can accurately quote him but somehow forget that, after his "possibly catastrophic" quote about Nvidia, he assures people that Nvidia will be fine and will address the issue. In typical David Kanter style, he mentions that if disgruntled Nvidia fans want to "dump" their GTX 980s, they can send them his way; he'll be sure to use them.

The link to the video is in the OP's post. Watch the ENTIRE end segment of the video.
 
VR is a paradigm shift that will change the world and the way people think. This is no 3D TV. Its applications span far wider than entertainment, including research and education. This will open up huge opportunities for entrepreneurs far and wide, not just for a few TV manufacturers. The potential is huge, and it's little wonder AMD is driving this new opportunity so aggressively. The GCN architecture is the perfect match for these workloads, and along with LiquidVR, nobody else is close to matching these capabilities. There sure are a lot of developers working with GCN these days!
 
I'm looking forward to seeing how reviewers bridge the gap into reviewing VR sets and video cards on VR sets. Pure FPS will be insufficient to really express the quality of a card/VR set combo. Latency between head motion and viewpoint motion, frame persistence, etc. will all need to be measured and quantified. Should be very interesting.
 

There's probably no easily standardisable test.

I suppose if you could make something like Valve's QR-code room, attach a camera and pass through real-time/no-lag video to only one eye's display, then 3D-model a VR room with the same dimensions/features as the QR room, you could get a decent measure of the lag from the discrepancy between the two views. It would be a serious hack job, though.
 
Well, if you knew precisely how fast you could move a motorised, mounted high-speed camera, and you could determine how many ms each frame captured by the camera covers, then you could do the math and calculate the latency difference. You'd just strap the VR set onto the high-speed camera and always set the motor to the same setting. Whether there is an even enough or responsive enough motor is another question, but I'm sure it could be managed with the kind of gear they use to shoot movies. Basically what TFT Central does to high-refresh-rate monitors to measure frame persistence, plus a motor track for the camera.
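The math behind that camera idea is simple enough to sketch. Rough Python below, with made-up numbers: at a known camera frame rate, each captured frame covers a fixed slice of time, so counting the frames between the rig starting to move and the image inside the HMD starting to move gives you the latency.

```python
# Rough sketch of the high-speed-camera latency math described above.
# The frame rate and frame count here are hypothetical examples.
def latency_ms(camera_fps, frames_between_motion_and_update):
    """Each captured frame covers 1000/fps milliseconds; multiply by the
    number of frames separating physical motion from the on-screen update."""
    ms_per_frame = 1000.0 / camera_fps
    return frames_between_motion_and_update * ms_per_frame

# e.g. a 1000 fps camera where the display catches up 18 frames later:
print(latency_ms(1000, 18))  # 18.0 ms
```

The camera's frame rate caps your measurement resolution: at 1000 fps you can't resolve latency finer than 1 ms, which is plenty for motion-to-photon numbers in the tens of milliseconds.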
 

Maybe you could log the motion sensors on the headset and record footage at the same time, then compare.

Or make an overlay that moves across the screen according to the sensor data. The GPU that can keep up with the overlay best (or lead it) wins.
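The "log the sensors and compare to footage" idea boils down to finding the delay between two time series. A hedged sketch, assuming you've already got head yaw from the IMU log and yaw recovered from the recorded footage, both sampled at the same rate: the shift that maximises their cross-correlation estimates the motion-to-photon lag in samples.

```python
# Sketch of estimating display lag by cross-correlating a sensor log
# against motion recovered from recorded footage. The signals and the
# sample rate are hypothetical; real data would need resampling/alignment.
def estimate_lag(sensor, screen, max_lag):
    """Return the shift (in samples) of `screen` relative to `sensor`
    that maximises their dot-product cross-correlation."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(0, max_lag + 1):
        # shift the screen series back by `lag` samples and correlate
        score = sum(a * b for a, b in zip(sensor, screen[lag:]))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# toy data: the on-screen trace is the sensor trace delayed by 3 samples
sensor = [0, 0, 1, 3, 5, 3, 1, 0, 0, 0, 0, 0]
screen = [0, 0, 0, 0, 0, 1, 3, 5, 3, 1, 0, 0]
print(estimate_lag(sensor, screen, 6))  # 3
```

Multiply the returned sample count by the sample period to get milliseconds. The hard part in practice is clock sync between the IMU log and the camera, not the correlation itself.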
 

So basically we need NASA (or other similar organisations) to do it?
 
Motorised camera tracks aren't that complicated or hard to get, dude.

No, they are not, but we do want it to be thorough, and organisations like NASA have some of the best and most accurate testing technologies. Still imperfect due to the human element, but better than most.
 
Take a garage, a camera and a bin-picker robot (which can do reproducible movement patterns); mount the VR set on its arm along with a second camera filming the VR screen. The difference between the first camera capturing the bin picker's movement and the second camera observing motion on the screen is your system lag. Not exactly cheap, but not NASA science either.
 
It doesn't matter - for Nvidia... people who care about VR and DX12 will just sell or give away their Maxwell cards and get Pascal cards = possibly millions of units in sales again.
 