Better Performance Forcing VSync in Forceware?

Woofmeister

Golden Member
Jul 18, 2004
1,385
1
76
I've got a 24-inch widescreen monitor, so image tearing is a big problem for me. As a consequence, enabling VSync is a necessity in almost every game (COD2 is a notable exception). That means I'm taking a double performance hit: running at 1920x1200 resolution and running with VSync. I've always been told that enabling VSync within the game is superior to forcing VSync in my video drivers, but I'm beginning to think that this is bad advice.

Recently, I found that I picked up about 10 FPS by enabling VSync and Triple Buffering in the Forceware advanced options rather than using the in-game VSync in F.E.A.R. (I'm using a hacked widescreen solution). Since every increase in playable frame rate you pick up in F.E.A.R. allows you to add more visual goodies, I can now use 2x anti-aliasing. Very nice! :thumbsup:

Anybody else have any experience in forcing VSync in Forceware?
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
The triple buffering option only works in OpenGL games, so it isn't doing anything for you in FEAR. As for why you saw a difference between enabling VSync in-game and forcing it in the drivers, I have no clue; they should both be doing the exact same thing.
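As far as I understand it, the reason the checkbox is OpenGL-only is that a Direct3D game asks for its buffer setup itself when it creates its device, so triple buffering has to come from the game. A rough D3D9 sketch of what that request looks like (hypothetical values, not FEAR's actual code):

// Rough D3D9 sketch (hypothetical values): the game itself asks for vsync
// and the extra back buffer when it creates its device.
#include <d3d9.h>

IDirect3DDevice9* CreateGameDevice(IDirect3D9* d3d, HWND hwnd)
{
    D3DPRESENT_PARAMETERS pp = {};
    pp.BackBufferWidth            = 1920;
    pp.BackBufferHeight           = 1200;
    pp.BackBufferFormat           = D3DFMT_X8R8G8B8;
    pp.BackBufferCount            = 2;                        // 2 back buffers + front = triple buffering
    pp.SwapEffect                 = D3DSWAPEFFECT_DISCARD;
    pp.Windowed                   = FALSE;
    pp.hDeviceWindow              = hwnd;
    pp.FullScreen_RefreshRateInHz = 60;
    pp.PresentationInterval       = D3DPRESENT_INTERVAL_ONE;  // wait for vblank = vsync on

    IDirect3DDevice9* device = nullptr;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
    return device;
}

If the game only asks for one back buffer and D3DPRESENT_INTERVAL_IMMEDIATE, you get plain double buffering with vsync off unless the driver overrides the interval.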
 

Woofmeister

Golden Member
Jul 18, 2004
1,385
1
76
So maybe it's just forcing VSync at the driver level rather than enabling it in-game?
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Woofmeister
So maybe it's just forcing VSync at the driver level rather than enabling it in-game?

Yes, but generally all a game does when you check that option is tell the driver to turn on VSync. :p

Maybe the game is not turning on triple buffering for some reason (either it doesn't naturally support it, or it's bugged)? I don't have an NVIDIA card right now, or I'd try to play around with it.
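For what it's worth, on the OpenGL side "in-game vsync" really is just a single request to the driver. A rough sketch using the standard WGL_EXT_swap_control extension (nothing game-specific):

// Rough sketch: the in-game vsync checkbox usually boils down to a call like this.
#include <windows.h>

typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

void SetVsync(bool on)
{
    // Must be called with a current GL context; the function comes from the driver.
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (wglSwapIntervalEXT)
        wglSwapIntervalEXT(on ? 1 : 0);   // 1 = wait for one vblank per SwapBuffers
}

Forcing vsync in the driver panel just overrides whatever the game asked for, which is why in theory the two paths should end up identical.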
 

VirtualLarry

No Lifer
Aug 25, 2001
56,579
10,215
126
Sounds like enabling it in Forceware is allowing it to "render ahead", which amortizes the rendering costs of each frame over a larger number of frames, resulting in greater overall efficiency, at some cost to interactive latency.
 

Woofmeister

Golden Member
Jul 18, 2004
1,385
1
76
Originally posted by: VirtualLarry
Sounds like enabling it in Forceware is allowing it to "render ahead", which amortizes the rendering costs of each frame over a larger number of frames, resulting in greater overall efficiency, at some cost to interactive latency.
You just completely made that up, didn't you? :laugh:
 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
Originally posted by: Woofmeister
Originally posted by: VirtualLarry
Sounds like enabling it in Forceware is allowing it to "render ahead", which amortizes the rendering costs of each frame over a larger number of frames, resulting in greater overall efficiency, at some cost to interactive latency.
You just completely made that up, didn't you? :laugh:


Lol. He must be an accountant with that verbiage.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,579
10,215
126
Originally posted by: Woofmeister
You just completely made that up, didn't you? :laugh:
No, that was a summary of a thread I read on the MS DirectX dev mailing list, mostly devs describing problems caused by the lack of control over the driver-level implementation of this. The driver fakes vsync to the app and lets it render ahead by several frames, while the driver buffers those frames and feeds them to the display, one buffer per displayed frame (assuming vsync is enabled in the driver and the overall rendering speed is faster than the display refresh rate). That lets the driver semi-cheat on benchmarks, but it also tends to hurt player-control feedback/interactive latency, most notably as a sensation of "input lag".
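Here's a toy way to see the trade-off. The frame times and the queue model are completely made up, just to show the direction of the effect, not how any real driver schedules frames:

// Toy simulation of vsync with a render-ahead queue. Frame times and queue
// model are invented purely to illustrate throughput vs. latency.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

static void simulate(int depth, const std::vector<double>& times, double refresh)
{
    std::vector<double> flipAt(times.size());      // when each frame reaches the screen
    double gpuFree = 0.0, lastFlip = 0.0, queueWait = 0.0;
    for (size_t i = 0; i < times.size(); ++i) {
        // With a queue of `depth` buffers, frame i can't start rendering until
        // frame i-depth has been flipped out of the queue.
        double start = gpuFree;
        if (i >= static_cast<size_t>(depth))
            start = std::max(start, flipAt[i - depth]);
        double done = start + times[i];
        gpuFree = done;
        // The frame is displayed at the next vblank after it finishes, and no
        // sooner than one refresh after the previous flip.
        double flip = std::ceil(done / refresh) * refresh;
        flip = std::max(flip, lastFlip + refresh);
        flipAt[i] = flip;
        lastFlip = flip;
        queueWait += flip - done;                  // time spent sitting in the queue
    }
    std::printf("queue depth %d: %.1f fps delivered, %.1f ms average wait before display\n",
                depth, 1000.0 * times.size() / lastFlip, queueWait / times.size());
}

int main()
{
    const double refresh = 1000.0 / 60.0;          // 60 Hz panel
    const std::vector<double> times = {12, 14, 22, 13, 25, 12, 14, 21, 13, 12};
    simulate(1, times, refresh);                   // plain double-buffered vsync
    simulate(3, times, refresh);                   // vsync with a 3-deep render-ahead queue
}

With those made-up numbers the 3-deep queue delivers noticeably more frames per second, because slow frames borrow slack from fast ones, but every frame sits longer in the queue before it reaches the screen. That extra wait is the "input lag" part.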