Virtual V-sync


BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
That's not true. I have the M/B up and running with the software, in Lucid performance mode. It's like running SLI. Have a read. I have a video too, but someone else can post it if they find it. I'll wait for more posts like yours before I link to it. See how I am.
http://vr-zone.com/articles/is-lucid-s-virtu-mvp-the-next-big-thing-for-gaming-/15070.html
You aren’t really posting anything useful. Repeatedly linking to a press release entitled “…the next big thing for gaming?” doesn’t mean anything. I know how it works; nobody’s claiming there’s no performance increase.

You still need drivers that rely on application-specific optimizations to get scaling, something Lucid promised wouldn't happen. Being able to mix AMD and nVidia GPUs is no big deal.

As for the performance claims, let’s see some real benchmarks and game compatibility testing.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
You aren’t really posting anything useful. Repeatedly linking to a press release entitled “…the next big thing for gaming?” doesn’t mean anything. I know how it works; nobody’s claiming there’s no performance increase.

You still need drivers that rely on application-specific optimizations to get scaling, something Lucid promised wouldn't happen. Being able to mix AMD and nVidia GPUs is no big deal.

As for the performance claims, let’s see some real benchmarks and game compatibility testing.

Already posted the video link. Yeah, it's all lies, apparently. Man, you guys never stop, do ya.

Not to take away from this topic, but the real news is outside my window. The sunrise today was great. But we have a problem: the sun is way too far north for this time of year. On my sundial it's where it would be in June. Fantastic sunrise; try using a welder's helmet. I know, off topic, but what a sight.
 

Khato

Golden Member
Jul 15, 2001
1,251
321
136
As for the performance claims, let’s see some real benchmarks and game compatibility testing.

Heh, but how does one even go about benchmarking it? Especially the combined Virtual Vsync + HyperFormance? There's no question that the fps will increase for any hardware that can exceed the monitor's refresh rate in that case, seeing as HyperFormance mode results in the video card doing less work, but does it provide any tangible benefit?

Even if the hardware is incapable of exceeding the refresh rate, there's the possibility of it still inflating the fps without actually resulting in more frames being rendered. But if it lives up to the claim of making 30 fps tolerable instead of jerky on limited hardware, there still might be something to be said for it.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Since we all have to make this point again and again, I'm going to use as little effort as possible:
https://developer.valvesoftware.com/wiki/Source_Multiplayer_Networking


http://en.wikipedia.org/wiki/Reaction_time

The mean reaction times for sprinters at the Beijing Olympics were 166 ms for males and 189 ms for females, but in one out of 1,000 starts they can achieve 109 ms and 121 ms, respectively.


That's about 10x the interval between new frames at 60 Hz, on a server running at 60 ticks, with cl_updaterate 60 and cl_cmdrate 60.

Stop pretending that pumping out 300 fps on a 1000 tick rate server makes you a better player, because it doesn't.

In a controlled environment (same frame rate and same latency) your brain learns to compensate. That's what makes you a good player. Adjusting your interp cvar does not.
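
For reference, here's that ratio worked out in a few lines of Python (just the numbers quoted above):

```python
# Back-of-the-envelope: human reaction time vs. the interval between
# new frames/updates at 60 Hz.
reaction_ms = 166.0            # mean sprinter reaction time (Wikipedia quote above)
frame_interval_ms = 1000 / 60  # new information every ~16.67 ms at 60 Hz

print(f"frame interval : {frame_interval_ms:.2f} ms")
print(f"reaction time  : {reaction_ms:.0f} ms "
      f"(~{reaction_ms / frame_interval_ms:.0f}x the frame interval)")
```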
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
I have the MSI 80 series with the new Virtual V-sync. Man, it is great. It's equal in value to a new high-end GPU. Virtual V-sync gives greater than 60 fps without tearing, and then there's the performance mode. I can't believe what Fraps is telling me. This is like magic.

I can't say much, but here is a story about it. Just so ya know, the MSI M/Bs are the ones ya really want with this feature. Plus ya get Thunderbolt.

http://vr-zone.com/articles/is-lucid-s-virtu-mvp-the-next-big-thing-for-gaming-/15070.html
How? Through what mechanism is your monitor taking frames faster than it is specified to?

The only case I can think of would be if Vsync itself could be software-controlled, such that a slight delay between rendering and refresh could be managed in software so that the frame is not sent to the monitor a few ms late, instead of waiting for the next refresh cycle. Do common display standards (DVI, HDMI, DP) allow this sort of thing? I would think that if they did, we'd have been using it already.
 

John Peterson

Junior Member
May 3, 2012
2
0
0
Can someone clearly and soberly explain this technology?

This page says that "In i-Mode [display connected to iGPU], it transfers the last completed frame that the GPU has processed into the iGPU, and then when the display requires a refresh, it can process that entire frame.". This is clear enough, but how does it work in d-mode [display connected to dGPU]?

Pending an understanding of its operation in d-mode, this is how I would describe the features:

  • iGPU QSV and dGPU available in the same Windows session: only beneficial.
  • Virtual Vsync in both i-mode and d-mode: input lag always better than or equal to regular vsync.
  • HyperFormance in both i-mode and d-mode: input lag always better than or equal to running without it.
Making the iGPU QSV and the dGPU available in the same Windows session has only benefits and is a feature that is easy to understand.

Virtual Vsync reduces input lag at all frame rates by displaying the last complete frame without idling the GPU, whereas regular vsync idles the GPU instead of drawing a more current frame once all display buffers hold a complete frame. The benefit can also be noticed when the average frame rate is lower than the refresh rate, because inexpensive frames might still occur that would have made regular vsync idle the GPU instead of drawing a more current frame. It also frees around 8 MB of vRAM compared to regular vsync by making triple buffering unnecessary. The downside of not idling the GPU is that temperature and fan speed stay at full-load levels.
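
A toy model of that input-lag claim (my reading of the description, not Lucid's implementation; the render times are made up):

```python
# How old is the displayed frame at each refresh, under regular vsync
# (GPU idles once the buffers are full) vs. Virtual Vsync (GPU keeps
# rendering and the newest complete frame is shown)? Age is measured
# from when the frame finished rendering.
REFRESH_MS = 1000 / 60  # ~16.67 ms between refreshes

for render_ms in (5.0, 12.0, 20.0):  # hypothetical per-frame render times
    if render_ms <= REFRESH_MS:
        # Regular vsync: one frame rendered right after the flip, then the
        # GPU idles; the frame sits waiting for the rest of the interval.
        regular = REFRESH_MS - render_ms
        # Virtual Vsync: frames keep completing every render_ms; the newest
        # one at refresh time is at most (refresh mod render) ms old.
        virtual = REFRESH_MS % render_ms
    else:
        # A frame slower than one refresh misses it either way.
        regular = virtual = 2 * REFRESH_MS - render_ms
    print(f"render {render_ms:5.1f} ms -> regular {regular:5.2f} ms old, "
          f"virtual {virtual:5.2f} ms old")
```

The ages come out better than or equal in every case, which matches the claim above.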

HyperFormance reduces input lag at all frame rates by removing frames from the rendering queue that will never be shown on screen. At a low average frame rate, inexpensive frames that render faster than the refresh interval might still occur and can be removed from the rendering queue. At higher frame rates there are, on average, more frames available to remove from the rendering queue.
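
And a matching toy model of the queue-dropping idea behind HyperFormance (again my reading, with a made-up submit interval):

```python
# Frames are submitted faster than the display consumes them; anything in
# the queue that can no longer reach the screen is dropped unrendered.
REFRESH_MS = 1000 / 60  # display consumes one frame per ~16.67 ms
SUBMIT_MS = 4           # hypothetical: the game submits a frame every 4 ms

submitted = rendered = dropped = 0
queue, next_refresh = [], REFRESH_MS

for t in range(100):  # simulate 100 ms in 1 ms steps
    if t % SUBMIT_MS == 0:
        queue.append(t)
        submitted += 1
    if t >= next_refresh:
        dropped += max(0, len(queue) - 1)  # only the newest frame survives
        rendered += 1 if queue else 0
        queue, next_refresh = [], next_refresh + REFRESH_MS

print(f"submitted={submitted} rendered={rendered} dropped={dropped}")
```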
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
This sounds like some kind of equivalent of triple buffering?

It reduces the "input" latency associated with vsync but doesn't eliminate it entirely. I still find triple buffering awkward to use for anything that requires fast and responsive feedback, such as mouse control in FPS games.

V-sync is a technology which, no matter how you slice it, adds some degree of artificial delay to the display of frames on screen, so "input" lag is inherent to the technology; it will always be turned off for me because of that.
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
[image: justforjag.png]
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Not sure where you're going with this, but your MS Paint skills are awful.
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
If you have a higher framerate, you see a new frame faster.
If you see a frame faster, you can react faster.
It doesn't matter what your reaction time is; if an old granny with a reaction time of 40 seconds observes a frame at 0 seconds or 0.009 seconds, she will be reacting at 40 and 40.009 seconds respectively.

How much time can we take away from total response time from rendering at a higher framerate? Consider the following:

It takes 16.66ms for your monitor to receive a single frame at 60hz. If you are rendering at exactly 60hz, and the frame transmission is perfectly in sync with your monitor, the very last line will be over 16ms out of date.
At an infinite framerate, every single line transmitted to the monitor will be the most up to date information available.

If something happens a single picosecond after the 60hz computer finishes rendering a frame, you won't know about it until the next frame 16ms later. On the computer with an infinite framerate, we see it as soon as the line gets scanned in. Is it a huge deal? By itself, not really. However the time savings really do start adding up after you start combining different things.
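
A quick sketch of that scanline math in Python (illustrative; it ignores tearing and transmission details):

```python
# Worst-case staleness of the last scanline sent to a 60 Hz monitor,
# as a function of render rate (vsync off).
SCANOUT_MS = 1000 / 60  # ~16.67 ms to transmit one full frame at 60 Hz

for fps in (60, 120, 300, 1000):
    render_interval_ms = 1000 / fps
    # Each line being scanned out is at most one render interval old;
    # at exactly 60 fps that's the full ~16.67 ms described above.
    staleness = min(render_interval_ms, SCANOUT_MS)
    print(f"{fps:>4} fps -> last scanline up to ~{staleness:.2f} ms stale")
```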
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
If you have a higher framerate, you see a new frame faster.
If you see a frame faster, you can react faster.
It doesn't matter what your reaction time is; if an old granny with a reaction time of 40 seconds observes a frame at 0 seconds or 0.009 seconds, she will be reacting at 40 and 40.009 seconds respectively.

How much time can we take away from total response time from rendering at a higher framerate? Consider the following:

It takes 16.66ms for your monitor to receive a single frame at 60hz. If you are rendering at exactly 60hz, and the frame transmission is perfectly in sync with your monitor, the very last line will be over 16ms out of date.
At an infinite framerate, every single line transmitted to the monitor will be the most up to date information available.

If something happens a single picosecond after the 60hz computer finishes rendering a frame, you won't know about it until the next frame 16ms later. On the computer with an infinite framerate, we see it as soon as the line gets scanned in. Is it a huge deal? By itself, not really. However the time savings really do start adding up after you start combining different things.


But the point is that your monitor can only display information 60 times per second, so being able to render 1000 FPS on a server that has 1000 tick rate vs. another player who renders only 60 FPS and has his rates set to 60, will give you no advantage at all because you don't see it before he does. And your mouse doesn't move more smoothly or accurately either.

I don't know why this is hard for you to understand.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
- Mostly a fad, according to hardware.fr

- TweakTown shows hit and miss

- 3DMark 11... apparently Lucid has no problem doing what, if done by nvidia or AMD, would be called cheating

and anandtech says:

"With this new technology, the FPS value is almost meaningless as it counts the frames that are not rendered. "
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
But the point is that your monitor can only display information 60 times per second, so being able to render 1000 FPS on a server that has 1000 tick rate vs. another player who renders only 60 FPS and has his rates set to 60, will give you no advantage at all because you don't see it before he does. And your mouse doesn't move more smoothly or accurately either.

I don't know why this is hard for you to understand.
facepalm
 

Anteaus

Platinum Member
Oct 28, 2010
2,448
4
81
Also puzzled as to how it will do this if your monitor is plugged into your video card and not the output video port of the motherboard?

When they say discrete, they aren't talking about an add-on card. They are talking about when a motherboard has two onboard chips: both an integrated chip such as an HD4000 for low power and a discrete chip such as an Nvidia 650M for more GPU-intensive apps. Laptops have been shipping with dual GPUs for a while now, and now they are using that tech on desktops. The V-Vsync they are showing is only applicable if you use their onboard GPUs. Of course, there is nothing keeping Nvidia or ATI from doing something similar with their drivers.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
But the point is that your monitor can only display information 60 times per second, so being able to render 1000 FPS on a server that has 1000 tick rate vs. another player who renders only 60 FPS and has his rates set to 60, will give you no advantage at all because you don't see it before he does. And your mouse doesn't move more smoothly or accurately either.

I don't know why this is hard for you to understand.

I'm not sure if this is the point being made, but... you have to remember that the frequency of updates is not the only thing that affects the latency between your input and the output on screen.

When your frame rate exceeds your refresh rate and you use vsync, you render into the back buffer; the card then artificially waits, doing nothing, until the back/front buffers are flipped to supply the new frame to the monitor.

If that wait period is very long and the card is very fast at rendering frames, it's possible it could have drawn another frame in that wait time, which would have newer information from the game world in it.

For example, at 1000 fps each frame takes 1ms to render, and on a 60hz monitor it's 16.666ms between each update. That means when the buffer flip happens, the next frame is generated in 1ms, which is 15.666ms BEFORE the monitor is ready to make another refresh; when you finally get your next refresh, your frame is 15.666ms old.

That's the whole point of triple buffering: to allow the video card to keep working in the background, drawing a frame at 14.666ms, 13.666ms, 12.666ms... and finally 0.666ms before the flip; when the buffer flips again, your newest frame is only 0.666ms old.

This reduces the actual latency effect while still maintaining only a fixed 60hz. I think that's what Ben90 was saying?
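
Checking that arithmetic (a toy model, assuming perfectly regular 1 ms frames):

```python
# Frame age at the buffer flip when rendering at 1000 fps on a 60 Hz
# monitor, with and without triple buffering.
REFRESH_MS = 1000 / 60  # ~16.667 ms between refreshes
RENDER_MS = 1.0         # 1000 fps -> 1 ms per frame

# Double buffering + vsync: the back buffer fills 1 ms after the flip,
# then the card waits; by the next refresh that frame is this old:
print(f"double buffered: {REFRESH_MS - RENDER_MS:.3f} ms old")   # ~15.667

# Triple buffering: the card keeps rendering; the newest complete frame
# finished at the last 1 ms boundary before the refresh.
frames_done = int(REFRESH_MS // RENDER_MS)                       # 16 frames fit
print(f"triple buffered: {REFRESH_MS - frames_done * RENDER_MS:.3f} ms old")  # ~0.667
```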
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Can someone clearly and soberly explain this technology?

This page says that "In i-Mode, it transfers the last completed frame that the GPU has processed into the iGPU, and then when the display requires a refresh, it can process that entire frame.". This is clear enough, but how does it work in d-mode?
In d-Mode Lucid would be creating the 3rd buffer for the display chain on the dGPU instead of the iGPU. Otherwise most other aspects are similar.