fps vs Hz monitor settings

Zenobia

Member
Aug 11, 2003
51
0
0
Tom's HW guide has a chart for video cards using AMD Athlon CPUs, found here: http://www4.tomshardware.com/graphi...02.html#aquanox
and for the test it's running Aquanox @ 1024x768 @ 32-bit @ 60Hz.

Aquanox is a graphically busy game, and 1024x768 at 32-bit is high end, so why 60Hz for the monitor setting? Why so low? What's the connection between game fps and the Hz setting for your monitor, if any?
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Usually the defaults for benchmarks are something like that, so that *every* monitor can run it at the same settings. Not all monitors can display really high refresh rates -- many cheap ones are limited to 75-85Hz anyway, even at 1024x768.

There's no relation between fps and refresh rate unless you have vsync on. Vsync forces your video card to only swap to a new frame between screen refreshes, which prevents texture tearing and certain other visual problems. It does cut performance a bit, and of course limits your maximum fps to your refresh rate, but it looks much better (to me, at least).
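
If it helps to put numbers on the "limits your maximum fps to your refresh rate" part, here's a rough back-of-the-envelope sketch (my own toy model, nothing to do with Tom's setup) assuming a plain double-buffered card that can only swap buffers on a vertical refresh. The function name and the example render times are made up for illustration.

[code]
# Toy model: with vsync on, a frame can only be shown at the next refresh,
# so a frame that takes even slightly longer than one refresh period has to
# wait a whole extra refresh before it appears.
import math

def vsync_fps(render_ms, refresh_hz):
    period_ms = 1000.0 / refresh_hz              # time between refreshes
    refreshes_per_frame = math.ceil(render_ms / period_ms)
    return refresh_hz / refreshes_per_frame      # effective frames per second

for render_ms in (5, 10, 20, 25):
    print(render_ms, "ms/frame ->", vsync_fps(render_ms, 60), "fps at 60Hz")
# 5 or 10 ms/frame  -> 60 fps (capped at the refresh rate)
# 20 or 25 ms/frame -> 30 fps (just missed a refresh, so it waits for the next one)
[/code]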
 

Zenobia

Member
Aug 11, 2003
51
0
0
Thx!

So, if I've got my monitor set to a 75Hz refresh rate, games will run at that if they can? Really, that's fast enough for me!
 

TheMouse

Senior member
Sep 11, 2002
336
0
0
Originally posted by: Zenobia
Thx!

So, if I've got my monitor set to a 75Hz refresh rate, games will run at that if they can? Really, that's fast enough for me!

I don't see why Tom benchmarks with his monitor at 60Hz. It will not affect the results/performance of the game or video card. But if you're playing a game and it's running at 120 fps while your monitor is set to 60Hz, you will only see every other frame (theoretically), since your monitor can't flash 120 screens per second, just 60 screens per second. Higher Hz and higher fps are always better (technically, maybe not practically). So if your monitor is running at 75Hz and you get 75 fps, that's good. If your monitor's Hz is lower than your fps, then you're not seeing all the frames your computer is pumping out. If your Hz is higher than your fps, then your monitor will flash the same frame more than once until it receives the next frame.
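
Just to make the "every other frame" part concrete, here's a little sketch of the idea (my own toy model, not anything official): assume frames come out at a perfectly steady rate and the monitor simply grabs whichever frame is newest at each refresh.

[code]
# Toy model: frames finish at a steady render_fps; at each refresh the monitor
# shows whichever frame is the newest one finished at that instant.
def frames_seen(render_fps, refresh_hz, seconds=1):
    seen = []
    for r in range(refresh_hz * seconds):
        # index of the newest frame finished by refresh r (integer math keeps it exact)
        seen.append((r * render_fps) // refresh_hz)
    return seen

print(frames_seen(120, 60)[:6])   # [0, 2, 4, 6, 8, 10] -> every other frame shown
print(frames_seen(30, 60)[:6])    # [0, 0, 1, 1, 2, 2]  -> each frame shown twice
[/code]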
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Benchmarks are quite obviously run with vsync off, otherwise they'd be worthless with the double buffering all 3D cards default to. Maybe THG (Lars?) is using an LCD (hopefully 16ms) as the test monitor, in which case 60Hz makes perfect sense.

Synthetic benchmarks like AquaMark and 3DMark have vsync disabled for obvious reasons: they're made to test speed, not image stability.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
I don't see why Tom benchmarks with his monitor at 60Hz.
NT-based OSes default to 60Hz in 3D games, and I suspect that Tom's doesn't want to use any of the utilities to change it in case they influence the benchmarks in some way.

It doesn't really matter what the refresh rate is; as long as vsync is off, it'll have absolutely no effect on the benchmark results. However, some of the memory-resident utilities used to change it could affect the results, so Tom's has simply taken the safest option.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
If your monitor is at 60Hz and you get 120 fps, that means that on each refresh you will see 2 frames.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Actually, you'll see half of each frame, which causes annoying visual artifacts if the viewpoint is moving (such as texture tearing). That's what Vsync is for.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Actually, I was wrong on that, if you have 120fps@60Hz w/vsync on, you will only see 60 frames out of those 120 per second, even if vsync wasn't on. And it depends on the timing whether you see half a frame or not; it's not necessarily all the time.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
if you have 120fps@60Hz w/vsync on, you will only see 60 frames out of those 120 per second, even if vsync wasn't on.

Try that again in English.

If your FPS is higher than your refresh rate, you will almost constantly get annoying visual artifacts unless you have Vsync on. If the video card is updating twice per monitor update, it *has* to be changing the picture in the middle of the refresh somewhere, which causes these artifacts. End of story.
 

AnMig

Golden Member
Nov 7, 2000
1,760
3
81
Maybe he was using an LCD.

Sorry, I did not check his review.

If that was the case, then the refresh rate is moot.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Which doesn't do jack for you when you change the image being displayed on the screen in the midst of a refresh. Double/triple buffering lets you do multipass rendering at rates independent of refresh without causing flickering, but doesn't fix synchronization problems between the video card and monitor.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
OK, but the picture still isn't gonna change until the next refresh. I didn't say you weren't gonna get artifacts. Maybe what happens is that the monitor has some sort of buffer or way it processes data. So the video card is still sending out the frames, but they don't show up until the refresh. The tearing is when one frame overwrites another and the refresh happens in the middle of that.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Yes, it will. If it didn't, there would be no need for vsync. This is the crux of the entire problem.

Edit: and now you're changing your post.

Look, I'll try to explain.

The graphics card renders to a "frame buffer" (a chunk of memory set aside for this purpose), which is then scanned and sent out to the monitor, which draws it on the screen. However, this process is not instantaneous, and the number of times per second that the monitor redraws what's in the frame buffer is known as the "refresh rate".

Nowadays, you usually switch back and forth between two or three frame buffers, so that you don't get flickering or artifacts if you have to draw an image in multiple steps. This is called double/triple buffering, and is not really related to the problem we're talking about in this thread.

In all modern video cards, it is possible to write to the frame buffer while the monitor is drawing it to the screen, in order to minimize delays between frames. When you do this, invariably you end up rewriting the frame buffer in the "middle" of the refresh cycle at least some of the time -- even if you are not drawing more frames per second than the refresh rate. When this happens, on average half of the screen (some upper portion of it) will be from the old frame, and the other part will be from the new frame. If the frames are significantly different, you will get artifacts at the border between the two images. If this happens many times per second, you get something called "texture tearing", where you will perceive a sort of jagged line (or set of lines) flickering on the screen where the border is. This is most pronounced in 3D games, where the whole image shifts slightly with every frame as you move your viewpoint around in a 3D environment.
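
To make that "middle of the refresh cycle" point a bit more concrete, here is a toy calculation (entirely my own sketch, not how any real driver works) of where the tear line would land if the monitor scans top to bottom over one refresh period and a new frame finishes partway through that scan: everything above that point comes from the old frame, everything below it from the new one.

[code]
# Toy model: the scanout takes one full refresh period (top to bottom).
# A frame that finishes mid-scan puts the tear line at whatever fraction of
# the screen the scanout had reached at that moment.
def tear_positions(render_fps, refresh_hz, n_frames=6):
    period = 1.0 / refresh_hz
    positions = []
    for i in range(1, n_frames + 1):
        t = i / render_fps                  # when frame i finishes rendering
        frac = (t % period) / period        # how far down the scan the swap lands
        positions.append(round(frac, 2))
    return positions

# At 90 fps on a 60Hz monitor the tear line wanders around the screen
# (roughly 2/3 down, then 1/3 down, then on the boundary, and the pattern repeats).
print(tear_positions(90, 60))
[/code]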

Vertical Synchronization, often abbreviated to "Vsync", is a method to avoid this. It is most useful in a double or triple buffering system, as you get quite a bit of slowdown in a single-buffered video subsystem (you can't draw to the frame buffer at all while refreshing, whereas in a double buffered one you only have to wait to do the switch). Basically, it forces the video card to wait until the monitor is finished with its current refresh cycle before switching frame buffers. This prevents texture tearing and other artifacts, but also limits your framerate to match your refresh rate and sometimes lowers FPS a bit (since, on average, a frame will have to wait 1/2 of a refresh cycle to be drawn). As I find these artifacts to be very distracting, and the monitor cannot really display more than that many frames per second anyway, you almost always want this on unless you are benchmarking.
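
And since pseudocode is sometimes clearer than prose, here's a simulated render loop showing the same idea (purely hypothetical, with Python standing in for the driver; wait_for_vblank and the 5 ms render time are made up): the card finishes a frame, then sits on its hands until the current refresh is done before swapping.

[code]
import time

REFRESH_HZ = 60
PERIOD = 1.0 / REFRESH_HZ
start = time.monotonic()

def wait_for_vblank():
    # Sleep until the next simulated refresh boundary -- a stand-in for the
    # real "wait for the vertical blank" that the hardware/driver does.
    now = time.monotonic() - start
    next_refresh = (int(now / PERIOD) + 1) * PERIOD
    time.sleep(next_refresh - now)

def render_frame():
    time.sleep(0.005)           # pretend the card takes 5 ms to draw a frame

frames = 0
while time.monotonic() - start < 1.0:
    render_frame()              # draw into the back buffer
    wait_for_vblank()           # vsync: hold the swap until the refresh finishes
    frames += 1                 # (the buffer swap would happen right here)

print(frames, "frames in one second")   # ~60, even though each frame took only 5 ms
[/code]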

I hope that clears things up a bit.

And you can't really talk about video cards and refresh rates *without* talking about the monitor and how it fits into the picture, so I don't see why you're objecting to it.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
It changes somewhere, I know, but that doesn't change the fact that your refresh rate is still a refresh rate, a sort of frame rate of its own, and no matter what, the number of frames you see is your refresh rate. A frame rate itself is a refresh rate; each refresh overwrites the image the way a frame overwrites a frame.

Let's call the refresh rate something else, like TVfps, and the video card fps VCfps.

You are seeing TVfps, and whenever a new TVframe shows up, it shows you whatever is in the VC's frame buffer at that moment: either a single frame, or one frame partway through overwriting another.

Think of it like moving the mouse in a game that is running at 5-10 fps. You move the mouse, you experience a delay, and then it doesn't even show the mouse moving, just the new position of the mouse.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: VIAN

Think of it like moving the mouse in a game that is running at 5-10 fps. You move the mouse, you experience a delay, and then it doesn't even show the mouse moving, just the new position of the mouse.

No -- if it did this it would be *fine*. The problem is that you'll sometimes get a frame drawn where half the mouse is in the old position and half is in the new position. That's what the whole problem is.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
I really don't understand what we were arguing about. I agree with everything that you said.

I think we both had the same idea. When I said that 120fps@60Hz, that you will only see 60 out of the 120 frames, I didn't mean they were complete frames or even talking about those frames. With vsync on the monitor will show you 60 frames from whatever it gets from the frame buffer.

You did however make my understanding better of frame buffers.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
I think we both had the same idea. When I said that 120fps@60Hz, that you will only see 60 out of the 120 frames, I didn't mean they were complete frames or even talking about those frames. With vsync on the monitor will show you 60 frames from whatever it gets from the frame buffer.

That second sentence there still makes my head hurt. "When I said that... you will only see 60 out of the 120 frames... [I wasn't] even talking about those frames." Have you considered running for President? :p

Without vsync, running 120fps@60Hz, you will see 60 monitor redraws. However, you will see at least a *part* of all 120 frames, most often seeing about half of one and half of another. This causes artifacts like the ones I described above.

With vsync, you still see 60 monitor redraws, but you will *only* see 60 full frames, and never part of two frames together. The other half will have been overwritten in the secondary frame buffer before getting displayed (or never drawn in a single buffering system). It will be artifact-free.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Ok, never mind. I have no clue then as to why we are arguing, because I completely agree with you, and what you are saying has been my idea the entire time.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
If your FPS is higher than your refresh rate, you will almost constantly get annoying visual artifacts unless you have Vsync on
Actually, that isn't the case at all; how much tearing you get depends on the framerate, the refresh rate, and the actual scene itself.

I find that I have to create artificial scenarios, such as standing still in a room with a flickering light, in order to see anything resembling constant tearing; during actual gameplay it's extremely rare.