
What factors affect 3D image quality?

Executor

Senior member
I am asking this because I am planning to get a GF3 Ti500 and want a card with great performance, a quality RFI filter (which improves 2D image quality), and high 3D image quality. What factors affect 3D image quality, and do they differ from one manufacturer's board to another? I am strictly talking about GeForce3 cards. Also, does anyone know how good the 2D image quality is on the Hercules GF3 Ti500? It seems to be a good performer, but the Leadtek is fast and has good 2D IQ. Thanks in advance.
 
If you are looking at videocards that use the same chipset (e.g. GeForce3 Ti in your case), the quality will be the same, unless a particular videocard has buggy drivers.
 
False. The image quality of a video card almost never depends on the chipset. It's the use of cheap components (like the output filters) on the card that screws up the image and gives an entire video card chipset a bad name... sigh

BTW, the Leadtek WinFast GF3 Ti500 (I don't remember the exact name) is one of the best-looking GF3 cards right now, if not THE best.
 
VBboy and MrHelpful, while both correct, are talking about different stages of image output. VBboy is talking about the rendering stage, and the correctness/accuracy/quality of the rendering into the framebuffer. MrHelpful, OTOH, is talking about the final displaying of the image from the framebuffer, through the RAMDAC/filters into the display monitor.
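As a toy illustration of those two stages (my own sketch, nothing from any card's actual hardware), here the "chipset" produces identical framebuffer contents on every board, while each board's analog output stage applies a different filter; a simple moving average stands in for the RC low-pass filters on a card's VGA output:

```python
def render_framebuffer():
    """Stage 1: the chipset renders into the framebuffer.
    Same chipset + same drivers -> same pixel values on every board."""
    # A sharp black/white edge, as a row of 8-bit intensities.
    return [0] * 8 + [255] * 8

def analog_output(framebuffer, filter_width):
    """Stage 2: the board's RAMDAC/filters drive the monitor.
    A wider (cheaper, more aggressive) filter smears sharp edges."""
    out = []
    for i in range(len(framebuffer)):
        window = framebuffer[max(0, i - filter_width + 1): i + 1]
        out.append(sum(window) // len(window))
    return out

fb = render_framebuffer()
good_board = analog_output(fb, filter_width=1)   # near-ideal filter
cheap_board = analog_output(fb, filter_width=4)  # heavy filtering

# Same framebuffer on both boards...
assert good_board == fb
# ...but the cheap board smears the edge on its way to the monitor.
assert cheap_board != fb
```

So both posters can be right at once: rendering quality is a chipset/driver property, while displayed quality also depends on the board maker's output components.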
 
lol, the Leadtek seems to be the best card image quality-wise at the moment. The ATI Radeon 8500, from what I'm reading now, also works quite well with the newest revision of the drivers.
 