Improve the 2D on your NVidia card - no surgery required...

Napalm

Platinum Member
Oct 12, 1999
2,050
0
0
Today I read an interesting thread in the newsgroup alt.comp.periphs.videocards.nvidia, started by Evil Ally. It has some info on why the 2D on NVidia cards blows chunks, and one guy offers a potential fix that doesn't involve surgery: he suggests turning down the refresh rate.

I had all my refresh settings on my Samsung 900NF cranked (i.e., up to 120Hz), so I tried them all at 75Hz. I was amazed at the difference in clarity. For me, it seems that anything above about 80Hz starts producing a bit of blur, and at 75Hz I can't see any flicker.

Anyway, I don't spend as much time here as I used to, so if this is old news then pardonnez-moi. ;)

Napalm
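The back-of-the-envelope math behind this tip: the pixel clock that the card's RAMDAC and output circuitry must drive scales with resolution × refresh rate × blanking overhead, so cranking the refresh rate pushes the analog output much harder. A quick sketch (the ~32% blanking overhead is an assumed typical figure for CRT timings, not something from this thread):

```python
# Rough pixel-clock estimate: why cranking the refresh rate stresses the
# RAMDAC and output filter. The 1.32 blanking factor is an assumption
# (typical CRT modes spend roughly 25-35% of each frame on blanking).
BLANKING_OVERHEAD = 1.32

def pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock in MHz for a CRT video mode."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for hz in (60, 75, 85, 100, 120):
    print(f"1024x768 @ {hz} Hz -> ~{pixel_clock_mhz(1024, 768, hz):.0f} MHz")
```

Doubling the refresh rate doubles the pixel clock, which is why the same card can look sharp at 60Hz and soft at 120Hz.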
 

Jonny

Golden Member
Oct 26, 1999
1,574
0
76
Yeah, but that kinda sucks though, don't you think? We all love higher refresh rates, and it sucks if we lose quality going up. Nvidia has some serious work to do if you ask me.
 

Wizkid

Platinum Member
Oct 11, 1999
2,728
0
0
I have a Hercules GeForce2 MX and I don't notice 85Hz being any more blurry than 75Hz @ 1600x1200.....

Thanks for the tip anyways though, it will probably help some people :)
 

Napalm

Platinum Member
Oct 12, 1999
2,050
0
0
Wizkid:

Go down to 1024x768, set your refresh to 120Hz and then try it at 60Hz. Besides the horrid flicker at 60Hz, do you notice that the image is incredibly sharper??

Napalm
 

Leo V

Diamond Member
Dec 4, 1999
3,123
0
0
Hi,

In my experience this holds for ALL normal videocards on ALL CRT monitors to some extent. Even with my old Stealth videocard, with several displays, I had the same effect. 85Hz was *slightly* blurrier than 75Hz, and 60Hz was always the sharpest.

Better monitors/videocards reduce the extent of the problem. However, it seems quite consistent across hardware. For all I know, this may be the eye's reaction to changing brightness caused by flickering.
 

Oreo

Senior member
Oct 11, 1999
755
0
0
This is true for all videocards and CRT monitors, as Leo V said. When you get up to the higher resolutions (depending on what size monitor you have), it's almost always a sharper image at a lower refresh rate. Even though this is true, I never go below 85Hz if I can avoid it; the flickering is more disturbing than a little blur.
 

Napalm

Platinum Member
Oct 12, 1999
2,050
0
0
Leo V:

While that may be true, my previous video card (a lowly V3 2k) did not require me to decrease the refresh in order to get a sharp image. Unfortunately, my current NVidia GeForce-based ASUS V6800 does. Just goes to show that the old adage "you gets what you pays for" does not always hold... ;)

Napalm
 

2dfx

Member
Sep 3, 2000
36
0
0
Yes, I was running my PowerColor GF2MX desktop at 1024x768 @ 85Hz and it was a blurry mess. I had a V3 3000 before and it was perfectly fine. I changed the GF2MX back to 75Hz a few weeks ago and it was a hell of a lot better. This is a video bandwidth problem, and surgery IS required to fix it if you want higher resolutions and refresh rates.

http://www.geocities.com/porotuner/
For those who don't know here is the surgery procedure. :D
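A rough sketch of the video-bandwidth argument: the analog VGA signal passes through a low-pass filter on the card, and if the filter's cutoff sits below the pixel clock of an aggressive mode, fine per-pixel detail gets smeared into blur. The component values below are purely illustrative assumptions, not measurements from any particular card or from the linked procedure:

```python
import math

# Sketch of why an output filter blurs high pixel clocks: a first-order
# RC low-pass attenuates signal content above its -3 dB cutoff.
# The R and C values used here are illustrative assumptions only.
def cutoff_mhz(r_ohms, c_farads):
    """-3 dB cutoff frequency of a first-order RC low-pass, in MHz."""
    return 1.0 / (2 * math.pi * r_ohms * c_farads) / 1e6

# e.g. a 75-ohm video line loaded by a hypothetical 22 pF filter cap
print(f"filter cutoff ~{cutoff_mhz(75, 22e-12):.0f} MHz")
```

A mode like 1600x1200 @ 85Hz needs a pixel clock well north of 200 MHz, so any detail above a cutoff in this ballpark is attenuated; removing or shrinking the filter capacitors raises the cutoff, which is what the "surgery" amounts to.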
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Man, does 1152 x 864 @ 100 Hz look great on this Radeon!! My Voodoo 3 was also just as sharp. Don't understand what all the fuss is about.
 

HigherGround

Golden Member
Jan 9, 2000
1,827
0
0
I also have absolutely no complaints with 1600x1200x32 @ 85Hz on a GTS and a 21" display. And yes, I tried the acclaimed G400 at the same resolution/frequency on the same monitor with no visible difference.
 

Zach

Diamond Member
Oct 11, 1999
3,400
1
81
This is true for all video cards in my experience. On my Matrox G200, I have to keep my 1280x1024 19" monitor down to about 90Hz; I liked 120 but it hurt after a while. My Viper V330 (remember the Riva 128?) can look good at my 17" monitor's max of 1024x768, but the 60Hz default is an obviously cleaner image. My poor Banshee card can really get bad at higher resolutions and refresh rates..
 

Leo V

Diamond Member
Dec 4, 1999
3,123
0
0
My GeForce2 GTS + Sony G400 19" work perfectly at 1280x960x32 @100Hz, or at 1600x1200x32 @85Hz. However, the lower-refresh/sharper-image phenomenon is still observable. The difference is that with better hardware it's much less noticeable. With my setup, the loss is negligible--but the refresh rate improvement certainly isn't.

I remember sharkeeper saying that even videocards of the same brand/model fluctuate significantly in 2D quality! :) (I've heard from people amazed at Asus GeForce cards' sharpness, too!)
 

Zucchini

Banned
Dec 10, 1999
4,601
0
0
Yes, this bad rap is getting annoying. My Hercules Prophet DDR and G400 @ 1600x1200 @ 85Hz look very %@# nice.
 

Spoooon

Lifer
Mar 3, 2000
11,563
203
106
I have my monitor set at 75Hz and notice no flickering whatsoever. I've never really complained about my image quality either.

VisionTek GeForce 2 MX
 

superbaby

Senior member
Aug 11, 2000
464
0
0
Sigh, you maniacs.

Refresh rate is not something where higher = better. If your image is sharp and clean and doesn't cause tears to run down your cheek, then your refresh is fine. A general rule is that 60Hz is too low, but 120Hz is too high for resolutions like 1024x768 and 1280x1024. Past 1600x1200 it all becomes subjective to the monitor, and you should experiment with what is best.

There is no holy grail of refresh rates at insane resolutions, but for "normal" resolutions 75Hz is best. 85Hz is pushing it too high; you'll start to get artifacts and flickering. Just see what you like! If your eyes get too tired, you should switch your refresh.

I find that 75Hz is a pretty good magic number for me; I use it for everything.
 

vailr

Diamond Member
Oct 9, 1999
5,365
54
91
Listening to distant (non-local) AM radio close to the computer, an 85Hz refresh rate produces much buzzing interference in the AM audio, whereas 75Hz is much less noisy. This is with a 3dfx V3 video card & Sony monitor. Also, the settings are in Hz, not MHz.
 

Leo V

Diamond Member
Dec 4, 1999
3,123
0
0
85Hz is superior to 75Hz where eyestrain/ergonomics are concerned. In fact, there is a minor improvement at 100Hz. It varies from CRT to CRT, depending on the screen's darkening speed.

I must point out, however, that eyestrain ISN'T the only important effect of refresh rates. The other one, which causes me to prefer 120Hz whenever available, is motion fluidity. Given fast enough hardware (very fast), the framerate is effectively limited by your refresh rate. In rapidly-panning games like Quake, there IS an improvement at 100Hz/100FPS over 85, and in fact I can discern 100Hz/100FPS from 120/120 when I rapidly spin in Quake. It's something you get used to, at which point 60FPS doesn't feel so smooth anymore :)
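For a sense of scale on the motion-fluidity point, this is just the arithmetic of frame intervals: the gap between 100Hz and 120Hz is under 2 ms per frame, which fits how subtle (but real) the difference feels when spinning in a game:

```python
# Frame interval (ms) at common CRT refresh rates. With fast enough
# hardware driving one frame per refresh, this is the time between
# updates the eye sees.
for hz in (60, 85, 100, 120):
    print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms per frame")
```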