The myth about visual differences in 2D gets a serious dent.

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I can't comment on the GTX 500/600 series, but the 2D quality increase from my 8800 GTS 320MB to HD 4890 was very noticeable. Going from HD 4890 to GTX 470 to HD 6950, I couldn't tell the difference though. In the past AMD had superior 2D image quality and it wasn't a myth. After Fermi I personally couldn't detect any difference between the two brands. I also tried a GT 210 for a couple of weeks and that card was fine too.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,949
504
126
Brent in that thread brought up an interesting point:

Forgive me if this has been asked already, but doesn't your choice of display monitor invalidate your results based on this information?

For these tests I’ll be using my NEC MultiSync 2690WUXi monitor. It is a professional IPS monitor, designed with color accuracy in mind. It actually has internal lookup tables, so it can have correction applied to it internally, independent of the device feeding it the signal.
Am I reading that wrong, or would your display correct the lookup tables regardless of the device input? Meaning, wouldn't you need to use a monitor that doesn't do this internal correction to accurately measure the card's actual display output?
I personally have not been able to tell the difference between the two vendors for the last couple (or more) generations, but IMO the test being done there is completely invalid and proves nothing on its own.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Brent in that thread brought up an interesting point:
Forgive me if this has been asked already, but doesn't your choice of display monitor invalidate your results based on this information?

Am I reading that wrong, or would your display correct the lookup tables regardless of the device input? Meaning, wouldn't you need to use a monitor that doesn't do this internal correction to accurately measure the card's actual display output?
I personally have not been able to tell the difference between the two vendors for the last couple (or more) generations, but IMO the test being done there is completely invalid and proves nothing on its own.

You somehow "forgot" the reply:

No, all the lookup tables do is translate a given input to a given output. They are three tables of numbers, with 8-bit inputs and 12-bit outputs. So it says "when you get a value of X for red, drive the subpixel with a value of Y."

Most monitors have them; it is how you adjust colour, contrast, and so on. The NECs are just able to be calibrated with their own tools. Also, the NECs are higher bit-depth, so they don't cause banding or colour loss.

However, they are consistent in their operation on a given input: they operate the same on it no matter what is feeding the signal, unless you change them.

The question isn't what the monitor is doing; the question is whether, given the same input in software from the two different cards, that input is then passed along to the monitor unchanged. The best way to test that would be to record the DVI signal, but I don't have anything that can do DVI or HDMI in at 4:4:4; the best I have does 4:2:2, which sub-samples colour, making it worthless for this test.

The next best thing I could do was to measure the actual colour output. If the same input on both cards generates the same output on the display, then it is quite a reasonable assumption that the same signal was being sent.

If anyone wants to buy me a Blackmagic DeckLink HD Extreme or an AJA Io XT, plus a stupidly fast SSD, I'll be happy to use them to do a direct signal capture. :)

I'm guessing it was an oversight... right? *cough*
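
To make the quoted explanation concrete, here is a minimal sketch of such a per-channel lookup table - using hypothetical gamma-curve values rather than the NEC's actual calibration data - mapping each 8-bit input code to a 12-bit drive value. The point is that the table knows nothing about the source, so the same input always produces the same output no matter which card feeds the signal.

```python
# Minimal sketch of a monitor's per-channel lookup tables (LUTs).
# The values below are hypothetical (a simple gamma curve), not the
# NEC 2690WUXi's actual calibration data.

def build_lut(gamma: float = 2.2) -> list[int]:
    """Map each 8-bit input code (0-255) to a 12-bit drive value (0-4095)."""
    return [round(((code / 255) ** gamma) * 4095) for code in range(256)]

# One table per channel: "when you get a value of X for red,
# drive the subpixel with a value of Y."
red_lut, green_lut, blue_lut = build_lut(), build_lut(), build_lut()

def apply_lut(pixel: tuple[int, int, int]) -> tuple[int, int, int]:
    """Translate one 8-bit RGB pixel into 12-bit subpixel drive values."""
    r, g, b = pixel
    return (red_lut[r], green_lut[g], blue_lut[b])

# The tables are indifferent to the source: the same 8-bit input yields
# the same 12-bit output whichever card is feeding the signal.
assert apply_lut((128, 64, 200)) == apply_lut((128, 64, 200))
```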
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
The 8800 series had serious issues with 2D quality, particularly at lower resolutions.

2D quality - chiefly sharpness - has been a non-issue for quite some time. Back in the days when Matrox was a player, they were well known for their superior sharpness at higher resolutions and refresh rates, particularly when compared to Riva 128 and TNT boards; users at 1280x1024 and above with 85Hz-plus refresh rates definitely experienced the "pain" when using those boards. The AGP G200 was far superior in 2D. When the Voodoo2 12MB came out, some vendors included an inferior VGA passthrough cable that introduced reflections (shadows on screen) at higher video frequencies. The cards themselves were not at fault.

Fortunately, when the Voodoo3 line came out its 2D was superb. Its 3D lagged behind the competition (TNT2, TNT2 Ultra), but those boards had inferior 2D quality - softer text, though not as much shadowing as the original TNT boards had.

Thank goodness we're past all of this. Worrying about text sharpness at high resolution is a thing of the past, and we concentrate on sheer performance instead.

One thing that could use serious side-by-side testing is video rendering quality. I'm sure there will be differences between the two, possibly a bit more than subtle as well.
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,347
1,530
136
To talk realistically about image quality, you *always* need to mention whether you are using a digital signal (HDMI, DVI, DisplayPort) or an analog one (VGA).

The quality of the analog signal depends greatly on the quality of the DACs provided. Matrox cards had great image quality because of good DACs; Nvidia cards don't have DACs as good as AMD's.

The quality of a digital signal is always perfect on any reasonably modern GPU; there is no difference at all between them.
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
To talk realistically about image quality, you *always* need to mention whether you are using a digital signal (HDMI, DVI, DisplayPort) or an analog one (VGA).

The quality of the analog signal depends greatly on the quality of the DACs provided. Matrox cards had great image quality because of good DACs; Nvidia cards don't have DACs as good as AMD's.

The quality of a digital signal is always perfect on any reasonably modern GPU; there is no difference at all between them.

Back in the analogue-only days we had choices of DB9, BNC, and DB13W3. The latter two provided superb quality.

I modified a few Nvidia cards (GeForce 256 DDR) for better analog output by jumping a few resistors on the boards. The DAC was not always the issue.
 

kami

Lifer
Oct 9, 1999
17,627
5
81
That was an amusing thread. One poster in particular dug himself into a hole, and when he couldn't dig any further he hired a deep-core drilling team.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
I always assumed that there was no difference. Image quality only increases with GPU power and in-game settings.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Using a CRT there is a clear advantage for ATI, or at least there used to be.
 

Zorander

Golden Member
Nov 3, 2010
1,143
1
81
Back in my CRT days (up to 2009), I came across cards that had tiny but noticeable smudge-like artifacts, whether in 2D or 3D. It took a lot of trouble (time, money, and eyesore) to return/exchange/sell cards until I got something with no such defects. I blame it on the cards' poor analog circuitry; manufacturers must have figured they could get away with it since most users had migrated to LCD anyway.

As soon as I upgraded to an LCD though, these issues became a moot point. As for CRT vs LCD, that's an entirely different story. :\
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Can you document this claim?
As I run CRTs... I cannot stand the lower image quality of LCDs.
If you know me then you know I personally prefer Nvidia, but the ATI cards I used looked way better on my CRT than any Nvidia card. It was the first thing I noticed and something I would never have expected at the time. In fact, even last year when I used a couple of computers that had integrated AMD video, they looked better than my GTX 260 when using the CRT. I don't see any difference when using an LCD though.
 
Last edited:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
If you know me then you know I personally prefer Nvidia, but the ATI cards I used looked way better on my CRT than any Nvidia card. It was the first thing I noticed and something I would never have expected at the time. In fact, even last year when I used a couple of computers that had integrated AMD video, they looked better than my GTX 260 when using the CRT. I don't see any difference when using an LCD though.

VGA, right?
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
I didn't even realize anyone was questioning the 2D image quality of modern cards over a digital link...
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,949
504
126
You somehow "forgot" the reply:



I'm guessing it was an oversight... right? *cough*
I didn't forget anything. I'm not even sure whether what is being described there is accurate. And it doesn't matter, because from what I've seen it is impossible to discern any difference between the two vendors, and I've seen both driving some expensive, high-end displays. I personally have a fairly expensive plasma, and I've driven it with both an Nvidia and an AMD card; I didn't see any image fidelity differences at all, and honestly never thought about it.
I didn't even realize anyone was questioning the 2D image quality of modern cards over a digital link...
Seems like the answer to a question no one asked? But some myths are hard to bust; to this day people are convinced that AMD/ATI drivers are at the same level they were 12 years ago, but when it comes down to it, it's a coin toss and there is not much to choose between them.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
I didn't forget anything. I'm not even sure whether what is being described there is accurate. And it doesn't matter, because from what I've seen it is impossible to discern any difference between the two vendors, and I've seen both driving some expensive, high-end displays. I personally have a fairly expensive plasma, and I've driven it with both an Nvidia and an AMD card; I didn't see any image fidelity differences at all, and honestly never thought about it.

Seems like the answer to a question no one asked? But some myths are hard to bust; to this day people are convinced that AMD/ATI drivers are at the same level they were 12 years ago, but when it comes down to it, it's a coin toss and there is not much to choose between them.
Unless you consider Crossfire support, of course ;)
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
Analog quality is going to be lousy when the DACs are integrated into the GPU. Some VGA CRTs are potentially better than even the best digital output, but the analog source's circuitry and DAC quality are too big a factor for analog to still be used.
 

TakeNoPrisoners

Platinum Member
Jun 3, 2011
2,600
1
81
I have two computers, both hooked up to the same model of monitor. My PC uses a 5850 and the other uses a GTX 550 Ti; there is no difference in 2D quality, so this result does not surprise me at all.