ATI vs NVIDIA image quality

Jan 31, 2005
32
0
0
hello,
NVIDIA has faster 3D cards (i.e. the 6800 Ultra), but I heard that ATI's cards have much better 2D image quality. I don't want to buy a card that is very fast in 3D but sucks in 2D. I do need to read text and write documents on my computer, you know? Please confirm whether this is true. Thanks!
 

JonnyBlaze

Diamond Member
May 24, 2001
3,114
1
0
I have an ATI card, but I'd bet it's hard to notice a difference in 2D quality. Tons of people have NVIDIA cards, and you'd hear about it if there were a widespread 2D problem. There might be certain monitor / resolution / refresh combinations that have problems, but you should be fine with either one.

JB
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: TheOasis
I don't get how any modern card could suck in 2D. What's so advanced about it?

Mostly the issues have been (from what I understand) OEM card manufacturers using cheap signal-filtering components to cut costs. This can cause excessive signal noise or instability in the RAMDAC outputs at high resolutions and refresh rates. I believe this was a problem on some GeForce4 and GeForceFX cards -- the GeForce6 lines seem to be better in this regard. ATI board manufacturers tend not to do this, possibly because there are fewer of them and thus less cutthroat price competition between them.

Dropping $5 from the manufacturing cost of a video card doesn't sound like much, but when you sell a couple million of them...
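The numbers involved make it clear why corner-cutting shows up mainly at high resolutions and refresh rates: the RAMDAC and its output filters have to handle the full pixel clock, and a filter that's fine at 80 MHz can smear the signal at 200+ MHz. A rough back-of-the-envelope sketch (the ~25% blanking overhead is an assumed ballpark for CRT timings, not a figure from this thread):

```python
# Rough pixel-clock estimate for an analog VGA mode.
# The RAMDAC drives the analog signal at the pixel clock rate; cheap output
# filters that behave at low clocks can attenuate/ring at high ones.
# Assumption: ~25% blanking overhead, a typical ballpark for CRT timings.

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    """Approximate pixel clock in MHz for a given display mode."""
    active_pixels = width * height
    total_pixels = active_pixels * (1 + blanking_overhead)
    return total_pixels * refresh_hz / 1e6

# A modest desktop mode vs. a high-end CRT mode of the era:
low = pixel_clock_mhz(1024, 768, 85)    # ~84 MHz
high = pixel_clock_mhz(1600, 1200, 85)  # ~204 MHz
```

Under these assumptions, 1600x1200@85 needs nearly 2.5x the analog bandwidth of 1024x768@85, which is why a marginal filter looks fine on one desktop and blurry on another.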
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
There is really no noticeable IQ difference between the two manufacturers these days, in 2D or 3D. I have heard Matrox's 2D is light years ahead of ATi/NV, but I have never witnessed it, so I couldn't tell you.
 

TheOasis

Banned
Feb 11, 2005
157
0
0
What makes ATi and NV's 2D so bad if Matrox is "light years ahead" of them? I don't get the need for a serious 2D card... wouldn't a regular GF 6200 do fine?
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: TheOasis
What makes ATi and NV's 2D so bad if Matrox is "light years ahead" of them? I don't get the need for a serious 2D card... wouldn't a regular GF 6200 do fine?

Well, Matrox cards always use VERY high quality components for their video outputs. Years ago, most 'consumer' graphics cards used really shoddy parts, and so there was a noticeable difference. However, I'm not so sure that they're any better than a good ATI or NVIDIA card these days in terms of 2D quality.

And on a DVI output, there should really not be any effective difference. It's a completely digital signal, so any superior analog filtering Matrox is doing goes right out the window.
 

TheOasis

Banned
Feb 11, 2005
157
0
0
But my question is, why would you need an expensive card for 2D? Couldn't a two-year-old card do 2D just as well as a new Matrox card?
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
We're talking about 2D output quality, not speed. Every card for the last 5-10 years has been ridiculously fast in 2D modes already. When a card was made has little to do with whether or not its VGA outputs are noisy (other than that newer cards *tend* to be better, but not always if the manufacturer is cutting corners).
 

Goi

Diamond Member
Oct 10, 1999
6,772
7
91
Image quality really depends on signal quality (incorrectly called "2D" quality, because it affects everything that goes onto the screen). This in turn depends on the particular board vendor rather than the GPU manufacturer. You can slap cheap filters on a new GPU and a five-year-old Voodoo3 will look better than it. I have a cheap Radeon 9550 with significantly worse 2D image quality than an Abit GF4 MX440 or even a Radeon 7000. I have another Radeon 9600 Pro (almost the same GPU) that has great 2D output.

Of course, all these differences are apparent only on the analog VGA output. If you're outputting via the DVI connector, you eliminate almost all the differences, and signal quality becomes a non-issue.
 

3chordcharlie

Diamond Member
Mar 30, 2004
9,859
1
81
Back in the TNT / Rage Pro days, I could definitely see that most ATI cards had better 2D output, and some people claimed to see a difference right up until one or two generations ago.

My TNT2 card definitely has less-than-spectacular output, but everything I have that's newer than that has been perfectly fine.
 

Oyeve

Lifer
Oct 18, 1999
22,076
887
126
My 4MB Matrox Millennium looks just as good in "2D" as my 9800 Pro. It's really a no-brainer today.
 

Deskstar

Golden Member
Mar 26, 2001
1,254
0
0
As above, some of the earlier (i.e., a few years back) NVIDIA cards did have poor filtering, so their 2D image quality was worse than ATI's at the time. Matrox had the best 2D then. I have owned all of these cards. Currently, however, the upper-end NVIDIA and ATI cards seem virtually identical to my eye in 2D. I have dropped my Matrox card, and I cannot imagine that its 2D is any better than what the recent upper-end NVIDIA and ATI cards provide.
 

MGMorden

Diamond Member
Jul 4, 2000
3,348
0
76
This problem was more common "back in the day". Its cause was mainly bad RAMDACs (the portion of the video card that creates an analog signal from the card's digital data). When the issue was common, NVIDIA just sold chipsets and had no control over the parts used by the OEM manufacturers, so a lot of cheaper components could be found on NVIDIA cards. ATI cards at the time were all made by ATI. That has changed now, however, and you're no longer guaranteed to get top-notch components when buying an ATI card.

Of course, if you're using a DVI-D connection then the issue goes away, as the RAMDAC on the card is no longer even used.
 

LED

Diamond Member
Oct 12, 1999
6,127
0
0
Next to Matrox, ATI was top dog after they purchased Tseng Labs and used their 2D tech, but after NVIDIA acquired 3dfx they caught up, so 2D is pretty much equal between all of them now.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Insomniak
There is really no noticable IQ difference between either manufacturer these days in 2D or 3D. I have heard Matrox's 2D is light years ahead of ATi/NV, but have never witnessed it, so I couldn't tell you.

I have heard that as well, but it seems to me I read a review of the fabled Parhelia that said it was about the same, even back then.
 

Goi

Diamond Member
Oct 10, 1999
6,772
7
91
IIRC, there have been reviews that measured the rise/fall times, frequency response, etc. of the signals coming out of the VGA output on an oscilloscope, and the Parhelia was way above average, but IIRC it wasn't the best. IMHO, as long as you stay away from unheard-of manufacturers, or those that don't follow the reference design, you should be safe. RAMDACs are mostly integrated on the chip nowadays, so it's the filters that make the difference.