TweakTown: Lucid VirtuMVP tested on Asrock-Z77+3770K - GTX680vs7870

Borealis7

Platinum Member
Oct 19, 2006
2,914
205
106
(you meant 2560x1600, right?)
Perhaps this will improve with driver maturity, but 1600p is still a very niche segment, and those people will probably have a better card than a 7870 in their rig ;)
 

utahraptor

Golden Member
Apr 26, 2004
1,052
199
106
I don't know why, but I had high hopes for Lucid Virtu MVP. This is kind of a letdown.

I will hesitate to use it based on those charts alone. It seems like a roll of the dice whether it will slow down or speed up your game, and it even depends on the settings you have enabled at that moment.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Seeing as it's virtually free, I think it's a good feature to have.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,684
1,268
136
Seeing as it's virtually free, I think it's a good feature to have.

Indeed. At present, though, you'd definitely need to test it and apply it only where it works, on a case-by-case basis.

But what about minimums? More load on the CPU may have a negative effect there...
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Seeing as it's virtually free, I think it's a good feature to have.
Amen. Give the software some time to mature so performance becomes consistent, and this will be a must-have for me.
 

MarkLuvsCS

Senior member
Jun 13, 2004
740
0
76
Wow, that looks pretty amazing. I mean, with the GTX 680 the software boosted FPS in every scenario, and not just by 1-2 FPS either! I would love to see how this affects games on budget-centric cards. It seems like a nice way to bring budget cards + Intel iGPUs more in line with AMD's APUs.
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
Rumor has it that NVIDIA is also going to stop the technology from working in future driver releases.

That's why I love NVIDIA :awe:
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
Impressive. It really would have been nice to see what the CPU utilization looked like. Does anyone have any idea how much extra load this puts on the CPU?
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
Why is this being so hyped up?

[Chart: Lucid Virtu MVP HyperFormance tested with ASRock Z77 and Intel Ivy Bridge]


Let's cover the issues we have with the technology at this early stage of Virtu MVP's life. The first is that when we really need the extra FPS in certain areas, not only do we not see it, but in some cases we're seeing a negative impact on performance. As always we aim for that 60 FPS mark, and under Mafia II, for example, we see our 74 FPS average move to 101 FPS at 1680 x 1050. This is great, but I'd say we're already at a playable level, sitting a strong 14 FPS above that 60 FPS target. At 2560 x 1600, though, we move from 47 FPS, a number we consider unplayable, to 37 FPS - not only still unplayable, but lower! It's not just Mafia II either; in our small sample pool here you can see that when we need the extra FPS in important areas, we don't get it.

What I thought would happen with the technology, then, was proven today: when we need the gains, we just don't get them. Unfortunately, at the moment we're getting a bit of a negative effect. It seems that "HyperFormance" helps remove things like your CPU bottleneck. The issue is, at 2560 x 1600 under really intensive games, there is no CPU bottleneck - it's all a VGA bottleneck.

So what's it for again? I can understand wanting to see higher numbers, but the difference between 60 FPS and 70 FPS isn't something I find very interesting, particularly when you consider that it won't give you the FPS gains when you're at low FPS and would actually want them.

At higher resolutions the GTX 680 would run into the same issues the 7870 was experiencing. I guess for those who demand 120 Hz/FPS it would be a godsend, and yes, free FPS is free FPS, but I'd much rather have a 30-to-40 FPS boost than a 100-to-150 one.
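
To make the bottleneck point concrete, here's a toy frame-time model. The millisecond splits are invented - only the FPS figures come from the review - and the real 47-to-37 drop would be the middleware's own overhead on top of this:

    def fps(cpu_ms, gpu_ms):
        """Toy model: frame rate is gated by whichever side is slower per frame."""
        return 1000.0 / max(cpu_ms, gpu_ms)

    # 1680 x 1050: CPU-bound, so stripping CPU-side work per frame helps.
    print(round(fps(cpu_ms=13.5, gpu_ms=9.0)))    # ~74 FPS before
    print(round(fps(cpu_ms=9.9, gpu_ms=9.0)))     # ~101 FPS with the boost

    # 2560 x 1600: GPU-bound, so the same CPU-side savings change nothing,
    # and any middleware overhead can only push the number down.
    print(round(fps(cpu_ms=13.5, gpu_ms=21.3)))   # ~47 FPS before
    print(round(fps(cpu_ms=9.9, gpu_ms=21.3)))    # still ~47 FPS at best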
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Wow, that looks pretty amazing. I mean, with the GTX 680 the software boosted FPS in every scenario, and not just by 1-2 FPS either! I would love to see how this affects games on budget-centric cards. It seems like a nice way to bring budget cards + Intel iGPUs more in line with AMD's APUs.
You might want to look at that again.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
All of these articles are getting this all wrong. MVP is not about boosting your framerate, it's about reducing input lag. Anyone benchmarking HyperFormance and stating "it makes the game faster" (or versions thereof) needs to be hit with a clue-by-four.

Virtual V-Sync: Work-around for a lack of true triple buffering in D3D. Use a 3rd buffer, keep letting the GPU work on frames and always display the latest frame. This is as opposed to D3D style buffering, which uses a buffer queue that stops rendering once the queue is full. End result: input lag with v-sync is reduced because the displayed frame didn't necessarily start rendering 16.6ms ago at the start of the buffer swap.
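
To make that concrete, here's a rough simulation of the two approaches (a sketch only - the 7 ms GPU time and 3-deep queue are numbers I made up, and real D3D presentation is more involved than this):

    REFRESH = 16.6   # ms per vblank (60 Hz)
    GPU = 7.0        # ms the GPU needs per frame (invented)

    def queued_lag(depth, vblanks=6):
        """D3D-style render-ahead: a FIFO of `depth` buffers. The GPU stalls
        once the queue is full, and each vblank displays the OLDEST frame."""
        queue, t, lags = [], 0.0, []
        for n in range(1, vblanks + 1):
            v = n * REFRESH
            while len(queue) < depth and t + GPU <= v:
                queue.append(t)              # this frame's input was sampled at t
                t += GPU
            stalled = len(queue) == depth
            lags.append(v - queue.pop(0))    # age of the displayed frame's input
            if stalled:
                t = max(t, v)                # GPU resumes once a buffer frees up
        return lags

    def virtual_vsync_lag(vblanks=6):
        """Virtual V-Sync: the GPU free-runs and each vblank flips to the
        NEWEST completed frame; staler completed frames are simply dropped."""
        lags = []
        for n in range(1, vblanks + 1):
            v = n * REFRESH
            newest_done = (v // GPU) * GPU        # newest completion time <= v
            lags.append(v - (newest_done - GPU))  # input sampled at frame start
        return lags

    print("3-deep queue, input lag per vblank (ms):",
          [round(x, 1) for x in queued_lag(3)])
    print("virtual v-sync, input lag per vblank (ms):",
          [round(x, 1) for x in virtual_vsync_lag()])

In this simulation the queued path settles at ~50 ms of input age (three refreshes deep), while the virtual v-sync path stays within one GPU frame of scan-out.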

Hyperformance: Use it with Virtual V-Sync. Console-style frame rendering time prediction. If it's too early to render a frame, render dummy frames* until it's time to render a frame. End result: input lag with v-sync is reduced because a frame is rendered at the last possible moment.

* Dummy frame: a frame with as many draw calls as possible stripped out. This frame will not be displayed, but it's necessary to render this frame because some rendering methods require intra-frame data. Because draw calls are being stripped out, things can go very wrong if Lucid's middleware screws up (which is why you sometimes see performance drop).
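
And here's a sketch of the pacing idea, based on my reading of Lucid's descriptions rather than their actual code - the per-frame costs are made up:

    REFRESH_MS = 16.6   # one 60 Hz refresh
    DUMMY_MS = 2.0      # a frame with most draw calls stripped is cheap (invented)
    REAL_MS = 7.0       # predicted cost of a real frame (invented)
    SLACK_MS = 1.0      # safety margin so the real frame still beats the vblank

    def pace_frame(now, next_vblank):
        """Delay the real frame as long as possible, filling the gap with dummy
        frames that keep intra-frame render state valid but are never shown."""
        dummies = 0
        while now + DUMMY_MS + REAL_MS + SLACK_MS < next_vblank:
            now += DUMMY_MS              # rendered, never displayed
            dummies += 1
        start = now                      # input is sampled here, at the last moment
        now += REAL_MS                   # real frame finishes just before scan-out
        return now, dummies, next_vblank - start

    _, dummies, input_age = pace_frame(now=0.0, next_vblank=REFRESH_MS)
    print(f"{dummies} dummy frames; the displayed frame's input is "
          f"{input_age:.1f} ms old")
    # vs. 16.6 ms old if the real frame had started at the previous vblank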
 

Alaa

Senior member
Apr 26, 2005
839
8
81
All of these articles are getting this all wrong. MVP is not about boosting your framerate, it's about reducing input lag. Anyone benchmarking HyperFormance and stating "it makes the game faster" (or versions thereof) needs to be hit with a clue-by-four.

Virtual V-Sync: Work-around for a lack of true triple buffering in D3D. Use a 3rd buffer, keep letting the GPU work on frames and always display the latest frame. This is as opposed to D3D style buffering, which uses a buffer queue that stops rendering once the queue is full. End result: input lag with v-sync is reduced because the displayed frame didn't necessarily start rendering 16.6ms ago at the start of the buffer swap.

Hyperformance: Use it with Virtual V-Sync. Console-style frame rendering time prediction. If it's too early to render a frame, render dummy frames* until it's time to render a frame. End result: input lag with v-sync is reduced because a frame is rendered at the last possible moment.

* Dummy frame: a frame with as many draw calls as possible stripped out. This frame will not be displayed, but it's necessary to render this frame because some rendering methods require intra-frame data. Because draw calls are being stripped out, things can go very wrong if Lucid's middleware screws up (which is why you sometimes see performance drop).
Are we going to see anything related to micro-stuttering, or anything related to SLI/CrossFire?
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Are we going to see anything related to micro-stuttering, or anything related to SLI/CrossFire?
Virtual V-Sync alone shouldn't do anything significant in that regard; however, if HyperFormance were to screw up its predictions, from what I gather it could.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
If you're using an FPS limiter together with v-sync, input lag is almost completely gone anyway. This Virtu stuff doesn't make any sense to me at all.
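
For reference, the whole trick of an external limiter is just this (a generic sketch, not any particular tool's implementation):

    import time

    TARGET_FPS = 59.7                  # just under the 60 Hz refresh rate
    FRAME_BUDGET = 1.0 / TARGET_FPS    # seconds allotted per frame

    def limited_loop(render_frame, frames=60):
        """Sleep off leftover frame time so the game never runs ahead of the
        display and the present queue never fills up (that queue is the lag)."""
        deadline = time.perf_counter()
        for _ in range(frames):
            render_frame()                         # the real per-frame work
            deadline += FRAME_BUDGET
            slack = deadline - time.perf_counter()
            if slack > 0:
                time.sleep(slack)                  # idle instead of queueing frames

    limited_loop(lambda: None)   # dummy workload, just to show the pacing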
 

aaksheytalwar

Diamond Member
Feb 17, 2012
3,389
0
76
Has anybody else noticed that whenever you get below 60 FPS, your FPS gets reduced further? There is no instance to the contrary in the benchmarks.
 

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
Rumor has it that NVIDIA is also going to stop the technology from working in future driver releases.
Wouldn't surprise me in the least. Just look at PhysX and SLI, both with artificial restrictions placed on them through drivers. They're a lot like Apple in that respect: if they don't want you using something, they will actively go out of their way to disable it rather than letting the enthusiast community play with it on the side, caveat emptor.
 

utahraptor

Golden Member
Apr 26, 2004
1,052
199
106
I think it would be a bad marketing move for NVIDIA or AMD to attempt to purposely sabotage Virtu MVP via drivers. I have no problem with a benchmark adding something to the results that says "Virtu MVP has affected this score." On the other hand, motherboard makers obviously have something at stake here; they are investing in Virtu MVP to help sell their boards. To me this would be like a USB chipset maker putting something in their driver that would prevent the FireWire ports on the same board from working.