Does vsync work properly with SLI in Oblivion?

darXoul

Senior member
Jan 15, 2004
702
0
0
Guys, as I asked in the thread title, does vsync work with SLI in Oblivion? Two factory-overclocked 7900 GTXs should give me enough power to run the game smoothly at high settings, and I don't care about the inability to do AA+HDR since I consider HDR quite overdone and unrealistic in most games - in Oblivion it looks great in some screenshots and horrible in others.

I just need to know whether vsync + SLI work, because if they don't, it simply sucks. I've read a few reports around the web that SLI + vsync in Oblivion results in a huge fps drop, down to the level of a single card. Of course, it's possible those folks simply didn't use triple buffering, which would have avoided the fps drop with vsync activated :p

The point is that I can't stand tearing, but I want a smooth frame rate. Therefore, no vsync with SLI would pretty much be a deal breaker for me, since Oblivion functions as my pretext for getting the otherwise relatively unnecessary and very expensive SLI rig :D
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
When I went from one 7800GT to two, I noticed that vsync didn't work as well, since it's being applied across two GPUs. I never tried triple buffering though; by the time I found out how to actually activate it, I was back to one card. I was disappointed by this, but I also don't know how well triple buffering would work with two GPUs. So all I really know is that with SLI, straightforward vsync sucks.
 

Tig Ol Bitties

Senior member
Feb 16, 2006
305
0
0
I must admit, with my 2 7800GTXs, vsync sucks a lot. My frames drop like a mofo, sometimes by half, and I actually get crashes. Thankfully, the tearing on my monitor doesn't bother me very much, if at all, since it's barely noticeable, but who wouldn't want it to go away? Hopefully there will be a patch or something for us SLI users to fix this problem in Oblivion. Otherwise, this is still my favorite game of any genre - can't stop playing it, too many things to do.
 

darXoul

Senior member
Jan 15, 2004
702
0
0
Thanks for the feedback.

Heh, so maybe getting just one XTX would be better than GTX SLI... I personally really hate tearing, and I heard it's often twice as bad on SLI.

Anyone else with first hand experience?
 

darXoul

Senior member
Jan 15, 2004
702
0
0
Usually, I don't bump, but hey - there MUST be more people with Oblivion and SLI on the boards! :)
 

Arkane13131

Senior member
Feb 26, 2006
412
0
0
I had two 7900GT CO Superclocks for a week.

While playing Oblivion, SLI seemed obnoxious: either the framerates were over 100 and the tearing was crazy, or the infamous slowdown occurred and the framerate dropped so drastically it wasn't smooth at all. The transition between the two is so drastic, I mean, that it makes the game less enjoyable than on a single 7900GT CO Superclock.

I figured enabling vsync would fix this, but then the game crashes, looks really strange while running, and it just sucks sucks sucks. I recommend a single X1900XTX for >$500 from Newegg (OEM). A single card is much more stable overall. I actually only scored around 12,000-13,000 in 3DMark05 with the dual 7900s, and I believe the X1900XTX is close to that anyway.

Cheaper, more stable... the X1900XTX is my recommendation.
 

DasFox

Diamond Member
Sep 4, 2003
4,668
46
91
SLI and Oblivion RIGHT HERE!

I run vsync with no problems. Give me a few days - I have an X-Fi XtremeMusic on order that should be here on 4/13, and when I slap that in I'll give you all another report. But in the past, with vsync on, I was getting 40-50 fps outdoors and around 70-85 indoors with no problems at all.

ALOHA
 

moonboy403

Golden Member
Aug 18, 2004
1,828
0
76
It's not the vsync that's not working, people.

It's triple buffering not being enabled via the driver in Direct3D games.
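
To put rough numbers on why that matters, here's a minimal sketch of the frame pacing (the 18 ms frame time and 60 Hz refresh are just assumed example values, not measurements from this thread):

// A minimal sketch (assumed numbers) of why vsync without triple buffering
// cuts the frame rate in half: with only two buffers, the GPU has to sit
// idle until the next 60 Hz refresh before it can start another frame, so
// anything slower than ~16.7 ms gets rounded up to a whole number of
// refresh intervals. A third buffer lets the GPU keep rendering while the
// finished frame waits for the vblank.
#include <cmath>
#include <cstdio>

int main() {
    const double refresh_hz = 60.0;                 // typical LCD refresh rate
    const double refresh_ms = 1000.0 / refresh_hz;  // ~16.7 ms per refresh
    const double render_ms  = 18.0;                 // hypothetical Oblivion frame time (~55 fps uncapped)

    // Double buffering + vsync: each frame occupies a whole number of refresh intervals.
    const double refreshes_per_frame = std::ceil(render_ms / refresh_ms);
    const double double_buffered_fps = refresh_hz / refreshes_per_frame;

    // Triple buffering + vsync: throughput is limited only by the render time,
    // still capped at the refresh rate.
    const double triple_buffered_fps =
        1000.0 / (render_ms > refresh_ms ? render_ms : refresh_ms);

    std::printf("uncapped:           %.1f fps\n", 1000.0 / render_ms);
    std::printf("vsync, 2 buffers:   %.1f fps\n", double_buffered_fps);
    std::printf("vsync, 3 buffers:   %.1f fps\n", triple_buffered_fps);
    return 0;
}

So a card that could do roughly 55 fps on its own gets pinned at 30 fps the moment double-buffered vsync kicks in, which lines up with the "halved fps" reports in this thread.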
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: darXoul
Thanks for the feedback.

Heh, so maybe getting just one XTX would be better than GTX SLI... I personally really hate tearing, and I heard it's often twice as bad on SLI.

Anyone else with first hand experience?

I have, and personally I think you should do that. I did it and loved it. I've noticed a lot of improvements, especially with vsync and screen tearing (and because of this, games look stunning while playing very, very smoothly).

My experience through the whole thing can be read here

Basically, right now, I've noticed ATI to be better than Nvidia as far as image, gameplay, etc. ON MY COMPUTER. I'm not the overall video card God/Judge, but this has worked out better for me. I don't know if your specs are close to mine so....

Good luck in whatever you choose, you'll buy what you want and have fun dude.
 

darXoul

Senior member
Jan 15, 2004
702
0
0
My current specs are nothing - I'm upgrading my computer right now, and I intend to buy 7900 GTX SLI or a single X1900XTX, 2 GB RAM, and probably an A64 3700+ OCed to 2.64 GHz. I'll be using either 1280x1024 or 1680x1050 in Oblivion, preferably with HDR/AA/AF, of course - if my rig can do it at an acceptable frame rate. If not, I can reduce some settings - grass shadows and self shadows seem redundant. HDR is also questionable for me, since sometimes it looks stunning, but in other situations unrealistic and overdone. We'll see...

Anyway, it looks like the X1900XTX is the best option for Oblivion. Not that it's bad for other games ;) I'll have to replace the stock HSF though - I find it really bad, and I've heard it many times on the X1800. The Accelero X2 is the way to go, but the major drawback is of course the voided warranty :/
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: moonboy403
It's not the vsync that's not working, people.

It's triple buffering not being enabled via the driver in Direct3D games.


nVidia drivers don't support triple buffering for D3D like ATi drivers do. Anyway, for those of you running SLI who want vsync without the massive FPS hit, get DXTweaker (it requires .NET though) and it will allow you to run any game with triple buffering (PunkBuster detects it as a cheat, though). Be warned that it's a bit sketchy in Oblivion and causes random crashes, including when using fast travel, but it's still better than having your fps halved.
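
For what it's worth, here's a rough illustration (not DXTweaker's actual code) of what "triple buffering" means at the Direct3D 9 level - a wrapper like DXTweaker presumably just rewrites the swap-chain setup along these lines before the game creates its device. It assumes the DirectX 9 SDK headers and leaves window and device creation out:

// Hedged illustration only: in Direct3D 9 terms, triple buffering just means
// asking the swap chain for two back buffers instead of one, alongside a
// vsynced present. Requires the DirectX 9 SDK headers to compile.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>

int main() {
    // The struct a D3D9 game fills in before calling CreateDevice.
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed             = TRUE;
    pp.BackBufferFormat     = D3DFMT_X8R8G8B8;
    pp.BackBufferCount      = 2;                        // two back buffers + front buffer = triple buffering
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;    // COPY allows only one back buffer; DISCARD (or FLIP) allows more
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;  // vsync: wait for one vertical retrace per Present()

    std::printf("Requesting %u back buffers with vsync enabled\n", pp.BackBufferCount);
    return 0;
}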
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: darXoul
My current specs are nothing - I'm upgrading my computer right now, and I intend to buy 7900 GTX SLI or a single X1900XTX, 2 GB RAM, and probably an A64 3700+ OCed to 2.64 GHz. I'll be using either 1280x1024 or 1680x1050 in Oblivion, preferably with HDR/AA/AF, of course - if my rig can do it at an acceptable frame rate. If not, I can reduce some settings - grass shadows and self shadows seem redundant. HDR is also questionable for me, since sometimes it looks stunning, but in other situations unrealistic and overdone. We'll see...

Anyway, it looks like the X1900XTX is the best option for Oblivion. Not that it's bad for other games ;) I'll have to replace the stock HSF though - I find it really bad, and I've heard it many times on the X1800. The Accelero X2 is the way to go, but the major drawback is of course the voided warranty :/



The Sapphire Blizzard (water cooled) card is on Newegg right now for $540 or something like that after MIR. Personally I'd rather just get the cheapest XTX possible and replace the cooler.
 

Elfear

Diamond Member
May 30, 2004
7,164
821
126
Originally posted by: 5150Joker

The Sapphire Blizzard (water cooled) card is on Newegg right now for $540 or something like that after MIR. Personally I'd rather just get the cheapest XTX possible and replace the cooler.

:thumbsup:
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Yeah, I also think that would be the way to go. HDR is worth it too (especially with the AA).
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
I'm not much of an FPS elitist, but even at 1920x1200 with 2xAA, 8x HQ AF, HDR, and a crapload of .ini changes along with texture replacements, the game runs fine for me on a single X1900XTX. Indoors, where all the fighting occurs, the FPS is great. It does dip outdoors, but heavy action outdoors is quite rare and so far it has been acceptable.

My only "lag" comes when the hard drive is working, and that's likely due to having only 1 GB of RAM.