Need help understanding Refresh rates

cotton

Member
Sep 12, 2000
89
0
0
I don't quite understand refresh rates. My monitor supports a maximum refresh rate of 85Hz at 1024 x 768, which is what I run.
I can also choose Optimal or Adapter Default when running at 1024 x 768. I have a Voodoo 3 3000 AGP card. Here are the questions:

1. Is it better to manually choose 85Hz, Optimal, or Adapter Default?

2. When I play Unreal Tournament and go into the Preferences in the game, I can change all the way up to 1024 x 768, 1280 x 968 and 1600 x 1200. If I choose above 1024 x 768, what happens, and what refresh rate would I be running at, since the monitor doesn't support above 1024 x 768? Is it better to pick 1024 x 768 in the game as well?

Thanks
 

Workin'

Diamond Member
Jan 10, 2000
5,309
0
0
If you KNOW what refresh rates your monitor+video card will support, manually choose the highest refresh rate that gives you the best undistorted picture. 1024x768@85 Hz should be plenty good.

If you pick a resolution that is out of range for your monitor, you will see a scrambled picture. If you choose a horizontal refresh rate (scan frequency) that is out of range, you may see a scrambled picture, a really squashed picture, or your monitor may never work again! Some monitors really don't like out-of-range horizontal signals and can fry if you accidentally apply one.

You should play your games at the highest resolution that gives you adequate frame rates, and is also within the capabilities of your hardware.
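To put rough numbers on the "out of range horizontal signal" point: the horizontal scan rate a mode demands is roughly the vertical refresh rate times the total scanlines per frame (visible lines plus blanking). Here is a minimal sketch in Python, assuming about 5% vertical blanking overhead and a hypothetical 30-70 kHz monitor range; real VESA timings and your monitor's actual limits will differ, so check the manual.

```python
# Rough sanity check: does a resolution + refresh rate combination fit within a
# monitor's horizontal scan range? The 5% vertical-blanking overhead is an
# assumption (real VESA timings vary slightly), so treat this as an estimate.

def approx_horizontal_khz(visible_lines: int, refresh_hz: float,
                          blanking_overhead: float = 0.05) -> float:
    """Horizontal scan rate ~= refresh * total scanlines (visible + blanking)."""
    total_lines = visible_lines * (1.0 + blanking_overhead)
    return refresh_hz * total_lines / 1000.0  # kHz

# Hypothetical monitor limits, e.g. from its manual: 30-70 kHz horizontal.
H_MIN_KHZ, H_MAX_KHZ = 30.0, 70.0

for width, height, hz in [(1024, 768, 85), (1280, 1024, 85), (1600, 1200, 75)]:
    khz = approx_horizontal_khz(height, hz)
    ok = H_MIN_KHZ <= khz <= H_MAX_KHZ
    print(f"{width}x{height} @ {hz} Hz -> ~{khz:.1f} kHz horizontal "
          f"({'within' if ok else 'OUT OF'} the assumed 30-70 kHz range)")
```

With these assumptions, 1024x768 @ 85 Hz comes out around 68 kHz, while the higher resolutions at similar refresh rates land well past 70 kHz, which is why the monitor either has to drop the refresh rate or reject the mode.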
 

HigherGround

Golden Member
Jan 9, 2000
1,827
0
0
Workin's explanation is correct; I'll just add one more thing. If you are using vendor drivers for both the video card and the monitor (i.e. the monitor shows up under its manufacturer's name, such as Viewsonic PS790, instead of "Generic P&P monitor"), then Win98/ME will only show refresh rates that are valid for the selected monitor and will prevent you from selecting a refresh rate that is out of its range. Usually it's best to select the highest rate possible; I believe that is what the "Optimal" setting does for you.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
One other possibility, besides those that Workin' listed (though those were all good examples): if you select something out of range, your monitor may flash a "Signal Out of Range" warning (that is what mine does).
 

Dufusyte

Senior member
Jul 7, 2000
659
0
0
Setting it to "Optimal" will give you the best refresh rate that your monitor and video card can do.

When you choose higher resolutions, the refresh rate always goes down, which creates more flicker. Lower resolutions (and thus higher refresh rates) provide a more solid image, which is why I like to game at low resolution. It's easier on the eyes.

Plus, the lower resolution allows you to see more frames per second. If someone is gaming at 1024 and their refresh rate is, say, 75 Hz, their monitor is only displaying 75 fps (regardless of what the video card is theoretically producing). But if you are gaming at 640 and your refresh rate is 100 Hz, your monitor can display 100 fps, so you can actually see more of the frames that your video card is pumping out.
 

Workin'

Diamond Member
Jan 10, 2000
5,309
0
0
Dufusyte, you are not really correct.


<< When you choose high resolutions, the refresh rate always goes down, which creates more flicker. Lower resolutions (=higher refresh rates) provide a more solid image, which is why I like to game at low resolution (higher refresh rate). >>


It is not always true that the refresh rate goes down at higher resolutions; it totally depends on your video card and monitor. Obviously, some have better performance than others.


<< Plus, the lower resolution allows you to see more frames per second. If someone is gaming at 1024, and their refresh rate is, say, 75hz, it means that their monitor is only displaying 75 fps (regardless of what the video card is theoretically producing). But if you are gaming at 640, and your refresh rate is 100hz, then your monitor is capable of displaying 100fps, so you can actually see more of the frames that your video card is pumping out. >>


Only true if you have vsync enabled! I don't know anyone who plays a first-person shooter like Quake, etc. with this setting enabled. With vsync turned off, the video card renders as many frames per second as it can, and they are displayed as fast as they are rendered; sometimes only a fraction of a frame is displayed before the next one is drawn, which is why you sometimes get "tearing" when vsync is off.
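To make the vsync point concrete, here is a toy sketch (Python, made-up numbers) of what the monitor actually shows. It ignores details like triple buffering and the way strict double-buffered vsync can drop the rate to even divisors of the refresh, so treat it as an illustration, not a spec:

```python
# Toy model of what the monitor shows. With vsync ON, the card waits for the
# retrace, so you never see more complete frames per second than the refresh
# rate. With vsync OFF, the card swaps whenever a frame is done, so one refresh
# can contain pieces of several frames -- that's the "tearing".

def displayed_with_vsync(render_fps: float, refresh_hz: float) -> float:
    """Complete frames shown per second when the card waits for the retrace."""
    return min(render_fps, refresh_hz)

def frames_per_refresh_without_vsync(render_fps: float, refresh_hz: float) -> float:
    """Average number of (partial) frames that land inside one refresh."""
    return max(render_fps / refresh_hz, 1.0)

# Illustrative numbers only: resolution, refresh rate, and what the card renders.
for res, refresh, fps in [("640x480", 100, 140), ("1024x768", 75, 120), ("1024x768", 85, 85)]:
    print(f"{res} @ {refresh} Hz, card renders {fps} fps:")
    print(f"  vsync on : {displayed_with_vsync(fps, refresh):.0f} complete frames/s shown")
    print(f"  vsync off: ~{frames_per_refresh_without_vsync(fps, refresh):.2f} frame slices per refresh (tearing when > 1)")
```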

EDIT: darn new forum tools!