
Video cards, I can't SEE the difference...:(

BigEdMuustaffa

Golden Member
Hi, I have used numerous different video cards to play Unreal, Quake II, etc. and frankly I can't see any visual differences. The EVGA GeForce MX 32MB doesn't put out an image that's any different from the VisionTek GeForce3 Ti 200, IMHO. I ran them both at 1024x768... am I missing something? I expected a grand difference with the VisionTek. Maybe I'm not doing something right?
 
Run both at 1024*768 but w/ the GF3 put on 2x AA and 32bit color and see which one runs better/looks better 😛
 
Those games are quite old, and quite frankly, there is nothing to improve upon. With your Ti 200, enable anisotropic filtering, run at a higher resolution, and give it some AA. The cards are rendering the same images, so there is absolutely no difference. It would look the same even on a Voodoo2. The only difference is the speed.

~Aunix
 
Originally posted by: BigEdMuustaffa
ok, set it to 2aa and tried 4aa, in the original Unreal, it looks pretty much the same? Am I wrong?

Some games don't really look much better with aa on (lots of old games). UT is the same way.
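Since a few replies mention 2x/4x AA, here's a rough illustration of what the filtering actually does to an edge. This is a toy supersampling sketch (render at 2x resolution, average each 2x2 block down), not how the GF3's multisampling works in hardware; the `render_edge` "renderer" is just a made-up diagonal-edge image for demonstration:

```python
# Toy supersampling anti-aliasing (SSAA): render at 2x resolution,
# then average each 2x2 block down to one output pixel.
# Purely illustrative -- no GPU or driver involved.

def render_edge(size):
    """Toy 'renderer': white above a diagonal edge, black below."""
    return [[1.0 if x > y else 0.0 for x in range(size)] for y in range(size)]

def downsample_2x(img):
    """Average each 2x2 block of samples into a single output pixel."""
    n = len(img) // 2
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4.0
             for x in range(n)] for y in range(n)]

aliased = render_edge(4)            # rendered straight at target resolution
aa = downsample_2x(render_edge(8))  # rendered at 2x, then filtered down

# The aliased image only contains pure 0.0 or 1.0 pixels; the AA image
# also has intermediate grey values along the diagonal, i.e. the
# "smoothed" stair-steps you see with AA enabled.
print(sorted({v for row in aliased for v in row}))  # → [0.0, 1.0]
print(sorted({v for row in aa for v in row}))       # → [0.0, 0.25, 1.0]
```

On old, low-detail games the geometry is so simple that these softened edges are easy to miss, which matches what people are reporting in Unreal.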
 
😉 If you're talking image quality then there is little diff between the GF2 and GF3; the GF4 is the much better one for that. In any case the diffs are often small when gaming, and you need to be sure an optimal refresh rate is being used for your monitor; WinXP especially defaults to 60Hz, which is pretty nasty. Don't forget to o/c that GF3 Ti 200: they are intentionally crippled and capable of far faster clocks in order to promote sales of the GF3 Ti 500, so a Ti 200 should reach very near Ti 500 speeds quite easily.

😀 If you are talking 3D perf then the GF3 is much better than the GF2, and the newer the game the bigger the diff. The GF3 offers much better AA and aniso as well as full DX8 hardware support. Crank up the res, the AA, the aniso and the detail sliders and you should be cool!
 
Weird, I thought for a moment I saw a refresh rate setting of 140Hz possible, then I installed the drivers for the 17" Philips 107P and the highest I can get is 100Hz. I read 85Hz is optimal for 1024x768, so I set it to 85, is that right?
 
🙂 Well even CRTs (standard monitors) have their optimum resolution, which often uses the 85Hz refresh rate. However, generally you should always use the highest refresh rate (Hz) that your monitor supports (or works with LOL). If your monitor can do 1024x768x32 @ 100Hz then most likely it does 1280x960x32 @ 85Hz, and that is its optimum resolution. To be honest, modern monitors largely eliminate moire and other deficiencies, so simply use the highest setting your monitor supports or can work with; as the resolution increases, the attainable refresh rate decreases. In any case 75Hz is generally considered to offer flicker-free output, 85Hz simply gives a bit more breathing space, while anything above this is simply 'nice to have'. Just MHO 😉
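The "higher resolution means lower attainable refresh" trade-off comes from the CRT's maximum horizontal scan rate (lines drawn per second). A rough back-of-the-envelope sketch; the 115 kHz scan limit and 1.05 vertical blanking factor here are illustrative assumptions, not the Philips 107P's actual spec sheet:

```python
# Sketch of refresh rate vs. resolution on a CRT:
#   refresh ≈ max_horizontal_scan / (visible_lines * vertical_blanking_factor)
# Both constants below are assumed for illustration only.

H_SCAN_MAX_KHZ = 115.0   # assumed max horizontal scan rate (kHz)
V_BLANK = 1.05           # assumed vertical blanking overhead

def max_refresh_hz(lines):
    """Rough upper bound on refresh rate for a given vertical resolution."""
    return H_SCAN_MAX_KHZ * 1000 / (lines * V_BLANK)

for w, h in [(800, 600), (1024, 768), (1280, 960), (1600, 1200)]:
    print(f"{w}x{h}: roughly {max_refresh_hz(h):.0f} Hz max")
```

With these assumed numbers, 1024x768 lands in the low 140s Hz, which is in the ballpark of the 140 setting briefly seen before the monitor driver capped the list; the driver .inf simply restricts the choices to rates the monitor is rated for.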
 
Hi, I have used numerous different video cards to play Unreal, Quake II, etc. and frankly I can't see any visual differences.
Crank up the detail levels on all cards and you'll see the slower ones drop to a slideshow quicker than the newer cards will.

Also, modern games using DirectX 8.x / OpenGL 1.3 features will show more effects and higher image quality on video cards with programmable pixel/vertex shaders, while those features will simply be missing on the older cards.
 