Video cards, I can't SEE the difference...:(

BigEdMuustaffa

Golden Member
Jan 29, 2002
1,361
0
0
Hi, I have used numerous different video cards to play Unreal, Quake II, etc., and frankly I can't see any visual differences. The EVGA GeForce MX 32MB doesn't put out anything visibly different from the VisionTek GeForce3 Ti 200, IMHO. I ran them both at 1024x768... am I missing something? I expected a grand difference with the VisionTek. Maybe I'm not doing something right?
 

Dark4ng3l

Diamond Member
Sep 17, 2000
5,061
1
0
Well, uh, they're both rendering the same thing, so of course it won't look different... the GF3 will just be faster.
 

allies

Platinum Member
Jun 18, 2002
2,572
0
71
Run both at 1024x768, but on the GF3 turn on 2x AA and 32-bit color, and see which one runs better/looks better :p
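
For the curious: when a game requests AA itself instead of having the driver control panel force it, it boils down to asking for a multisampled framebuffer. A minimal OpenGL/GLUT sketch of that, assuming a GLUT install and an ARB_multisample-capable driver (the window title and size are just placeholders):

[code]
#include <GL/glut.h>

/* Older GL headers may not define the 1.3 token. */
#ifndef GL_MULTISAMPLE
#define GL_MULTISAMPLE 0x809D
#endif

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    /* GLUT_MULTISAMPLE requests a multisample-capable visual; the
       driver decides the sample count (2x, 4x, ...) it can provide. */
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH | GLUT_MULTISAMPLE);
    glutInitWindowSize(1024, 768);
    glutCreateWindow("AA test");

    /* Core in OpenGL 1.3; older headers call it GL_MULTISAMPLE_ARB. */
    glEnable(GL_MULTISAMPLE);

    /* ...register display/idle callbacks and call glutMainLoop()... */
    return 0;
}
[/code]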
 

sash1

Diamond Member
Jul 20, 2001
8,896
1
0
Those games are quite old, and quite frankly, there is nothing to improve upon. With your Ti 200, enable anisotropic filtering, run at a higher resolution, and give it some AA. Otherwise the cards are rendering the same images, so there is absolutely no difference; it would look the same even on a Voodoo2. The only difference is the speed.

~Aunix
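
Aniso is a per-texture setting under the hood; if anyone's curious, this is roughly the call a game makes for it, via the EXT_texture_filter_anisotropic extension that GF3-class cards expose. A minimal sketch assuming standard GL headers; the token values are the ones from the extension spec:

[code]
#include <GL/gl.h>

#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT     0x84FE
#define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
#endif

void setMaxAnisotropy(GLuint texture)
{
    GLfloat maxAniso = 1.0f;
    /* Ask the driver how many taps it supports (8 on a GeForce3). */
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxAniso);

    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, maxAniso);
}
[/code]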
 

CubicZirconia

Diamond Member
Nov 24, 2001
5,193
0
71
Originally posted by: BigEdMuustaffa
OK, I set it to 2x AA and tried 4x AA; in the original Unreal it looks pretty much the same? Am I wrong?

Some games don't really look much better with AA on (lots of old games). UT is the same way.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
;) If you're talking image quality, there is little difference between the GF2 and GF3; the GF4 line is much better for that. In any case the differences are often small while gaming, and you need to be sure an optimal refresh rate is being used for your monitor; WinXP especially defaults to 60Hz, which is pretty nasty. Don't forget to o/c that GF3 Ti 200: they are intentionally crippled and capable of far faster clocks (it was done to promote sales of the GF3 Ti 500), so a Ti 200 should reach very near Ti 500 speeds quite easily.

:D If you're talking 3D performance, the GF3 is much better than the GF2, and the newer the game, the bigger the difference. The GF3 offers much better AA and aniso as well as full DX8 hardware support. Crank up the res, the AA, the aniso, and the detail sliders and you should be cool!
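
A rough way to verify a card actually exposes those AA/aniso features: check the GL extension string once you have a context up. Sketch only; the simple strstr match is good enough here, though it can false-hit on prefixed extension names:

[code]
#include <GL/gl.h>
#include <cstring>
#include <cstdio>

/* Requires a current GL context (glGetString returns 0 otherwise). */
bool hasExtension(const char* name)
{
    const char* exts = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return exts != 0 && std::strstr(exts, name) != 0;
}

void reportCaps()
{
    std::printf("multisample AA:  %s\n",
                hasExtension("GL_ARB_multisample") ? "yes" : "no");
    std::printf("aniso filtering: %s\n",
                hasExtension("GL_EXT_texture_filter_anisotropic") ? "yes" : "no");
}
[/code]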
 

BigEdMuustaffa

Golden Member
Jan 29, 2002
1,361
0
0
Weird, I thought for a moment I saw a refresh rate setting of 140Hz possible, but then I installed the drivers for the 17" Philips 107P and the highest I can get is 100Hz. I read 85Hz is optimal for 1024x768, so I set it to 85. Is that right?
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:) Well, even CRTs (standard monitors) have a native resolution, which often works out at the 85Hz refresh rate. Generally, though, you should always use the highest refresh rate (Hz) your monitor supports (or works with, LOL). If your monitor can do 1024x768x32 @ 100Hz, then it most likely does 1280x960x32 @ 85Hz, and that is its optimum resolution. To be honest, modern monitors largely eliminate moiré and other non-native deficiencies, so simply use the highest setting your monitor supports or can work with; as the resolution increases, the attainable refresh rate decreases. In any case, 75Hz is generally considered to offer flicker-free output, 85Hz simply gives a bit more breathing space, and anything above that is simply 'nice to have'. Just MHO ;)
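
If you'd rather not trust the driver dropdown, here's a small Win32 sketch (assuming <windows.h> and a display driver that reports its modes) that lists every refresh rate the card/monitor combo claims to support at 1024x768x32, so you can pick the top one instead of the 60Hz default:

[code]
#include <windows.h>
#include <cstdio>

int main()
{
    DEVMODE dm = {0};
    dm.dmSize = sizeof(dm);

    /* Walk the mode list the driver reports for the primary display. */
    for (DWORD i = 0; EnumDisplaySettings(NULL, i, &dm); ++i) {
        if (dm.dmPelsWidth == 1024 && dm.dmPelsHeight == 768 &&
            dm.dmBitsPerPel == 32) {
            std::printf("1024x768x32 @ %luHz\n", dm.dmDisplayFrequency);
        }
    }
    return 0;
}
[/code]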
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,997
126
Originally posted by: BigEdMuustaffa
Hi, I have used numerous different video cards to play Unreal, Quake II, etc., and frankly I can't see any visual differences.
Crank up the detail levels on all the cards and you'll see the slower ones drop to a slideshow sooner than the newer cards will.

Also, modern games that use DirectX 8.x / OpenGL 1.3 features will show more effects and higher image quality on cards with programmable pixel/vertex shaders, while those features will simply be missing on older cards.
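
For what it's worth, a minimal sketch of how a DX8 game can make that distinction at startup, assuming the DirectX 8 SDK headers and d3d8.lib: query the device caps and compare the reported shader versions. A GF3 reports vs/ps 1.1, while a GF2/MX reports none and gets the fixed-function fallback:

[code]
#include <d3d8.h>
#include <cstdio>

int main()
{
    IDirect3D8* d3d = Direct3DCreate8(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS8 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    /* Programmable cards (GF3/GF4 Ti) report shader version 1.1+;
       fixed-function cards (GF2/MX) report 0.0, so DX8 effects are
       skipped or emulated poorly. */
    std::printf("vertex shaders 1.1: %s\n",
                caps.VertexShaderVersion >= D3DVS_VERSION(1, 1) ? "yes" : "no");
    std::printf("pixel shaders 1.1:  %s\n",
                caps.PixelShaderVersion >= D3DPS_VERSION(1, 1) ? "yes" : "no");

    d3d->Release();
    return 0;
}
[/code]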