For some reason, even with the nvreffix for the 27.xx drivers, I still think it's at 60Hz.

LS20

Banned
Jan 22, 2002
5,858
0
0
huh?

I installed 27.70.

Refresh was at 60.

Reinstalled the old refresh fix 0.169 (or whatever) and it fixed the refresh right up again. No problems here.
 

Brian48

Diamond Member
Oct 15, 1999
3,410
0
0
nvRefreshrate does not support any of the drivers in the v27.xx series (read the statement at the author's site). There's a modified version of the v27.42 and v27.51 that's floating around that's nvRefreshrate compatible. Just use the .inf file from either set in place of the regular nv4_disp.inf that comes with v27.70.
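If you want to script that swap rather than do it by hand, here's a rough Python sketch of the file replacement Brian48 describes. The folder names are hypothetical placeholders for wherever you extracted the v27.70 package and the modified v27.42/v27.51 set; they are not from this thread.

```python
# Hedged sketch only: paths below are placeholders, adjust to where you
# extracted the v27.70 drivers and the nvRefreshrate-compatible .inf.
import shutil
from pathlib import Path

driver_dir = Path(r"C:\NVIDIA\27.70")                    # hypothetical extracted 27.70 folder
modded_inf = Path(r"C:\NVIDIA\27.42-mod\nv4_disp.inf")   # hypothetical modified .inf

original = driver_dir / "nv4_disp.inf"
backup = driver_dir / "nv4_disp.inf.bak"

shutil.copy2(original, backup)      # keep the stock .inf around, just in case
shutil.copy2(modded_inf, original)  # swap in the nvRefreshrate-compatible .inf
print(f"Replaced {original} (backup at {backup}); now run the driver setup.")
```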
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Without any modification the nvreffix program does not work with the 27.42s or higher. It's possible to fiddle around with different files to get it to work, but I prefer waiting for the next version of Riva Tuner (RC11) because it's much better than installing the nvreffix program.
 

xerx

Junior Member
Aug 23, 2001
24
0
0
I would just disable V-sync, but does that fix the refresh rate problem?

I also found that Nvidia installs nView by default with the 27.xx series drivers. I disabled it in msconfig and my games don't lag anymore.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
<< I would just disable V-sync, but does that fix the refresh rate problem? >>

No, vsync has nothing to do with the monitor's refresh rate.
 

xerx

Junior Member
Aug 23, 2001
24
0
0


<< I would just disable V-sync, but does that fix the refresh rate problem?

No, vsync has nothing to do with the monitor's refresh rate. >>



Can you tell me what V-sync is for?
I see it mentioned in almost every post, and they all say to turn V-sync off.
 

Bozo Galora

Diamond Member
Oct 28, 1999
7,271
0
0

Vsync is when your video card waits for your monitor's vertical refresh before drawing the next frame. So what? Why is that important? Well, on some monitors, turning off Vsync will stretch the vertical (up and down) too much and cause tearing in the textures, making the game hard to play. That is what I experience. But the real question is whether it makes a difference on or off.

Monitors have a refresh rate, measured in Hertz (Hz), and most monitors default to 60Hz. Your average user will stay on that without knowing there is better. Most advanced users run the desktop at 1024x768 and 75Hz, and that is about normal for them, even getting into the low end. The higher the Hz, the faster the monitor refreshes the screen, which means less flicker and much less eye strain. But what does this have to do with games? Most games have Vsync enabled by default, so your game can't run faster than your refresh rate. If you have a 60Hz refresh rate, you can't get higher than 60 FPS in most games, and usually lower. People who "tune" their system usually turn this off to let the game render as fast as it can, thus making more FPS. For some people there is no stretching and barely any flickering. But for a lot of people their screens get stretched so far that the game becomes unplayable. Sure, performance is gained, but what good is it if you can't see the entire screen? Some people argue that even with Vsync on, you can get over 60 FPS at a 60Hz refresh rate. Let me put that to rest by telling you that it is not true at all. If someone claims they get 200 FPS in a game, that means either they have a refresh rate of at least 200Hz or Vsync is off and they don't know it.

If, on a given hardware platform, an application is capable of rendering at, say, 75 FPS with vsync off, the optimal refresh rates with vsync on will likely be 75Hz and 150Hz: clean multiples of the raw rendering frame rate, which come closest to syncing screen refreshes with page flips. At 60Hz or 120Hz, an application that renders at 75 FPS with vsync off is likely to deliver only 60 FPS with vsync on. The "optimal" refresh rate is not, therefore, the one Windows chooses, or the highest your monitor can reach at a given resolution; it's the one that comes closest to syncing with page flips. Although experimentation with different refresh rates is required, it is possible to approach the raw rendering speed you get with vsync off without suffering the visual degradation of tearing.
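To put some numbers on that "clean multiple" idea, here is a rough Python sketch (mine, not from any of the linked articles) of the usual double-buffered model, where a finished frame has to wait for the next refresh tick before it can be shown. It assumes a steady render rate and no render-ahead queue; the function name is just my own.

```python
# Minimal sketch: with vsync on and simple double buffering, a finished frame
# is only shown on the next refresh tick, so its effective display time is the
# render time rounded UP to a whole number of refresh intervals.
import math

def effective_fps(render_fps: float, refresh_hz: float) -> float:
    """Approximate FPS with vsync on, assuming a steady render rate
    and no render-ahead buffering."""
    render_time = 1.0 / render_fps           # seconds to render one frame
    refresh_interval = 1.0 / refresh_hz      # seconds per screen refresh
    # The frame is held until the next refresh boundary.
    display_time = math.ceil(render_time / refresh_interval) * refresh_interval
    return 1.0 / display_time

if __name__ == "__main__":
    for hz in (60, 75, 120, 150):
        print(f"raw 75 fps @ {hz}Hz refresh -> ~{effective_fps(75, hz):.0f} fps with vsync")
    # Prints roughly 60, 75, 60, 75: the "clean multiple" refresh rates
    # (75Hz, 150Hz) keep the full 75 fps, while 60Hz and 120Hz drop it to 60.
```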






 

Lorne

Senior member
Feb 5, 2001
873
1
76
That's a load; the refresh rate on your monitor has nothing to do with frame-rate limitations.
Refresh on a monitor only stops flicker. It takes more power to run at higher resolutions, and if the monitor does not supply enough you get a little bit of flickering, so the next step is to push its screen refresh faster to fool your eyes.

Vsync (on or off) is for your GFX card. If it's ON, the computer will wait for acknowledgment from the card before sending the next set of info. If it's OFF, the computer will send it as soon as possible, but if your card cannot process it fast enough it dumps and starts the next set of instructions, which is the tearing.

The reason you show better framerates or scores in benchmarks with Vsync off is that there is no latency waiting for the card to finish before the next frames are sent. What these programmers need to integrate into these programs is a lost-frame count, because getting high scores or framerates doesn't mean it looks its complete best, only what your computer and graphics card can do at their maximum with the present settings.
I get well over 60fps with my TNT2U at a 60Hz monitor refresh.
If you look around hard enough I'm sure you can find a Windows version of what I think is called Mref (I don't remember, it was so long ago). It is an unlocked GUI version of the stock Windows PCI VGA driver, and it lets you change the monitor refresh on the fly from 1-250Hz. It's a no-no in the wrong hands, since it can pop a monitor very fast if it doesn't have built-in protection, as a few don't.
 

Bozo Galora

Diamond Member
Oct 28, 1999
7,271
0
0
After two years, it's becoming quite annoying to be corrected by morons who know nothing about which they talk.

I will give you this from Anand Lal Shimpi himself. I will not engage your stupidity further:

AT link




I didn't bother posting too many performance scores since the performance of the AOpen PA70 is very similar to that of the Savage3D board Anand looked at in the large Video Accelerator Round-up. If you are interested in in-depth benchmarks on various games/etc., check this article out.

The performance of the Savage3D is good. It isn't a Voodoo2 killer (in D3D games it actually does significantly outperform the Voodoo2), but performance is up there. Keep in mind that this board was running at 110MHz, not 125MHz like some of the review Savage boards that are floating around the web. There is, however, a problem with the Savage3D's performance.

Vsync On/Off

The tests were conducted with Vsync off to ensure that the frame rate was not being limited by Vsync (i.e. waiting for monitor refresh before drawing the screen). Unlike some cards, which suffer minimally when turning Vsync on/off, the Savage3D's performance dropped insanely when I turned Vsync on. 800x600 performance dropped from 42.6fps to 35.2! 1024x768 dropped from 27.3 to 23.7, and 640x480 frame rates dropped to 43.9 (from 55.7). What does this mean?

Well, in order to understand the significance of this large drop, you must first understand how frame rate is calculated. Frame rate is calculated as total frames / amount of time. The frame counter is incremented every time a frame is drawn. We can also look at the FPS as amount of frames / certain amount of time; let us shrink this interval of time down to 1 second, and we get frames / a certain second. Now we add up all the values of frames / a certain second and divide by the total number of seconds.

The problem is that, no matter how fast the Savage can actually render certain scenes, it will be capped by a 60Hz refresh rate (the refresh I used for the OpenGL tests). If a card did a consistent 40 frames / a certain second, whether Vsync was on or off would be negligible. However, a card with very inconsistent performance, let's say anywhere from 15 to 150 frames / a certain second, will suffer greatly with Vsync on, since every time the card hits above 60 frames / a certain second, the number will be capped to 60. Let's say we had a 3-second demo which ran (Vsync off) at 9fps in second 1, 150fps in second 2, and 15fps in second 3 (a slight exaggeration of the performance of any real card). This would yield an average FPS over the entire demo of 58fps (Vsync off). Turn Vsync on in this test, and the 150fps in second 2 will be changed to 60fps in second 2. This will change the average FPS to 28fps, a HUGE difference.

While the consistency of the Savage3D is much greater than the consistency in the example above, the Savage3D is still very inconsistent compared to other cards (most cards lose 2-3fps at most, provided Vsync-off performance is lower than the refresh rate, which is the case with the Savage). This low consistency means that the Savage3D's FPS can dip to extremely low rates, an issue Tom, of Tom's Hardware, discusses further.
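If you want to sanity-check the 3-second demo numbers quoted above, this little Python snippet (my own, just redoing the arithmetic, not Anand's code) averages the per-second frame counts with and without a 60Hz cap:

```python
# Quick check of the 3-second demo arithmetic: average the per-second frame
# counts with vsync off, then cap each second at the 60Hz refresh rate to
# simulate vsync on.
per_second_fps = [9, 150, 15]   # frames rendered in each second of the demo
refresh_hz = 60                 # vsync cap used for the OpenGL tests

avg_vsync_off = sum(per_second_fps) / len(per_second_fps)
avg_vsync_on = sum(min(fps, refresh_hz) for fps in per_second_fps) / len(per_second_fps)

print(f"vsync off: {avg_vsync_off:.0f} fps average")   # 58 fps
print(f"vsync on:  {avg_vsync_on:.0f} fps average")    # 28 fps
```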



Also scroll down to the vsync explanation here


And Guru3D: see #4