Keeping it balanced

beatle

Diamond Member
Apr 2, 2001
Sadly, my 64MB GF4 Ti4200 is getting on in age. I have it overclocked to within a few MHz of Ti4600 speeds. With all this talk of new games and the like, I'm predicting I'll need a bit of a boost, especially with only 64MB of video memory.

When I bought my Ti4200 I was running a T-bird @ 1266 with a GF2 GTS. Nothing ran faster, but I could turn all the eye candy on and run higher resolutions. Not until I upgraded my CPU did I see an improvement in framerates.

I don't jack the resolution up all the way, but I do like to play with all the eye candy on, especially in single-player games. I'm aiming for a constant 60+ fps at 1024x768, no AA, maybe 4x or 8x AF. With the P4C @ 3.09 rig in my sig, what video card would mate well with it and give me the performance I'm looking for without breaking the bank (under $200)?
 

Arcanedeath

Platinum Member
Jan 29, 2000
A 9800 Pro w/ a VGA Silencer looks like your best bet. OC away and it shouldn't break the bank :)
 

Mullzy

Senior member
Jan 2, 2002
9800Pro all the way (good frames per second to $$ ratio)

One word of warning, though (I did the nVidia -> ATI switch a couple of months back): If you used nVidia's Digital Vibrance feature and have gotten used to it... you will have a hard time adjusting to an ATI card. No amount of contrast, brightness, or gamma fiddling can come ANYWHERE NEAR touching the color enrichment from nVidia's DVC. If you never used DVC, don't turn it on now or you'll be sorry when it's gone.
 

beatle

Diamond Member
Apr 2, 2001
Thanks. I'll keep these cards in mind once I see just how badly D3 and HL2 run on my system.
 

dmw16

Diamond Member
Nov 12, 2000
D3 and HL2 will run pretty badly, I would imagine :)

No reason not to wait, though... prices can only come down. I would check the forums the week D3 comes out. I predict some people will be unloading their 9800 Pros or XTs so they can crank everything in D3 way up.
-doug
 

VirtualLarry

No Lifer
Aug 25, 2001
Originally posted by: Mullzy
9800Pro all the way (good frames per second to $$ ratio)

One word of warning, though (I did the nVidia -> ATI switch a couple of months back): If you used nVidia's Digital Vibrance feature and have gotten used to it... you will have a hard time adjusting to an ATI card. No amount of contrast, brightness, or gamma fiddling can come ANYWHERE NEAR touching the color enrichment from nVidia's DVC. If you never used DVC, don't turn it on now or you'll be sorry when it's gone.

You're the second person on these boards I've seen mention that they were a fan of DVC. I tried it when I had a GF2 MX in my box and didn't see much benefit; it seemed like things just became slightly oversaturated. (My preference for most displays is to turn down the brightness, saturation, and contrast, which I guess is the opposite of what most people prefer.)

I am curious, though: how is it implemented? It can't be "magic", but I know it's not supported on any NV chipset older than the GF2 MX (meaning the original GF2 doesn't support it).

It seems almost like a gamma setting, but more like the H/S/V settings for controlling TV-out, except applied in the digital domain to the RGB data before it ever reaches the RAMDAC. I would think that such a thing should be fairly trivial to do for all modern cards. Heck, even the newest ATI cards can actually adjust the power output of the RGB signals going to the monitor.

I guess, in a roundabout way, I'm wondering why this can't be implemented in all modern video-card driver sets. (I'm using a Radeon 9200 at the moment.) Unless NV has a patent on it?
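
Just to make my guess concrete, here's roughly the kind of per-pixel math I'm imagining, in plain C. This is purely speculation on my part about what a "color enrichment" control might do (push each channel away from the pixel's luma and clamp), not anything from NV's actual drivers, and the 0.2 "vibrance" value is just a made-up slider setting:

/* Speculative sketch of a DVC-style "color enrichment" step.
 * Just my guess at the math -- NOT NVIDIA's actual algorithm.
 * vibrance = 0.0 leaves a pixel alone; 0.2 is roughly a slider
 * nudged 20% of the way up. */
#include <stdio.h>

static unsigned char clamp255(double v)
{
    if (v < 0.0)   return 0;
    if (v > 255.0) return 255;
    return (unsigned char)(v + 0.5);
}

static void enrich(unsigned char *r, unsigned char *g, unsigned char *b,
                   double vibrance)
{
    /* Rec.601 luma acts as the "gray" reference for this pixel. */
    double y = 0.299 * *r + 0.587 * *g + 0.114 * *b;

    /* Push each channel away from the luma; brightness stays put,
     * saturation goes up. */
    *r = clamp255(y + (1.0 + vibrance) * (*r - y));
    *g = clamp255(y + (1.0 + vibrance) * (*g - y));
    *b = clamp255(y + (1.0 + vibrance) * (*b - y));
}

int main(void)
{
    unsigned char r = 180, g = 90, b = 60;  /* a dull orange */
    enrich(&r, &g, &b, 0.2);
    printf("%u %u %u\n", r, g, b);          /* 193 85 49: same luma, more color */
    return 0;
}

In hardware I'd expect something like this to live in a lookup table or a tiny fixed-function block in the display pipeline rather than per-pixel CPU code, which is exactly why I'd think any modern card could do it.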
 

Mullzy

Senior member
Jan 2, 2002
VirtualLarry,

I used DVC with it just barely turned on and loved it (both on my GF2 MX and my Ti4200). Basically, I'd slide the setting up from OFF until I could see the smallest amount of color enrichment (maybe 20% along the slider). I watch a lot of movies and play a lot of games, and I was really frustrated by how washed out everything seemed on my 9800 Pro in comparison. KOTOR lightsabers barely seem to have any color, for example.

The next time I'm ready for a GPU upgrade, I hope nVidia and ATI are more even on the $$->FPS scale so I can get my DVC back. :D
 

Matthias99

Diamond Member
Oct 7, 2003
Originally posted by: VirtualLarry
Originally posted by: Mullzy
9800Pro all the way (good frames per second to $$ ratio)

One word of warning, though (I did the nVidia -> ATI switch a couple of months back): If you used nVidia's Digital Vibrance feature and have gotten used to it... you will have a hard time adjusting to an ATI card. No amount of contrast, brightness, or gamma fiddling can come ANYWHERE NEAR touching the color enrichment from nVidia's DVC. If you never used DVC, don't turn it on now or you'll be sorry when it's gone.

You're the second person on these boards I've seen mention that they were a fan of DVC. I tried it when I had a GF2 MX in my box and didn't see much benefit; it seemed like things just became slightly oversaturated. (My preference for most displays is to turn down the brightness, saturation, and contrast, which I guess is the opposite of what most people prefer.)

I am curious, though: how is it implemented? It can't be "magic", but I know it's not supported on any NV chipset older than the GF2 MX (meaning the original GF2 doesn't support it).

It seems almost like a gamma setting, but more like the H/S/V settings for controlling TV-out, except applied in the digital domain to the RGB data before it ever reaches the RAMDAC. I would think that such a thing should be fairly trivial to do for all modern cards. Heck, even the newest ATI cards can actually adjust the power output of the RGB signals going to the monitor.

I guess, in a roundabout way, I'm wondering why this can't be implemented in all modern video-card driver sets. (I'm using a Radeon 9200 at the moment.) Unless NV has a patent on it?

If I had to guess, I'd say ATI doesn't have it because it's distorting the color information in a nonstandard way (maybe doing some sort of nonlinear saturation adjustment based on brightness?). It definitely makes the colors less 'true', but many people prefer an oversaturated, overbright display (that's why the TVs at big electronics stores are set up that way), because it 'looks' more, well, vibrant. :p
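
To put some (completely made-up) math behind that guess: something like the C sketch below, where the saturation gain scales with how bright the pixel already is, would give that extra 'pop' in bright areas while leaving shadows mostly alone. Again, this only illustrates the guess, not ATI's or NVIDIA's real code; the sqrt() curve and the 0.5 strength are arbitrary.

/* Illustration of the guess above: a saturation boost whose strength
 * depends on the pixel's brightness.  Completely speculative. */
#include <math.h>
#include <stdio.h>

static double clamp01(double v)
{
    return v < 0.0 ? 0.0 : (v > 1.0 ? 1.0 : v);
}

/* r, g, b in [0,1]; strength is the user's slider in [0,1]. */
static void enrich_nonlinear(double *r, double *g, double *b, double strength)
{
    double y = 0.299 * *r + 0.587 * *g + 0.114 * *b;

    /* The nonlinear part: gain rises with brightness, so dark pixels
     * are barely touched while bright ones get the extra saturation. */
    double gain = 1.0 + strength * sqrt(y);

    *r = clamp01(y + gain * (*r - y));
    *g = clamp01(y + gain * (*g - y));
    *b = clamp01(y + gain * (*b - y));
}

int main(void)
{
    double r = 0.70, g = 0.35, b = 0.25;   /* a bright but dull color */
    enrich_nonlinear(&r, &g, &b, 0.5);
    printf("%.3f %.3f %.3f\n", r, g, b);
    return 0;
}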