2D Video Quality Champ - ATI or nVidia?

cheez

Golden Member
Nov 19, 2010
1,722
69
91
Hello people,

If you remember, a long way back there were quite a few reviews of ATI and nVidia cards for 2D video quality, and ATI always came out on top. How is it nowadays? Have nVidia cards improved in this area? How do they compare to ATI? Are there any good reviews for this kind of test? I've been out of computer hardware stuff for several years, so I'm way behind... My current video card is an ATi X1950XTX. I prefer a high-end card, as I'm planning to swap out the old card in my HTPC. How is the GeForce GTX 460/470? It looks like a monster 3D performer and great bang for the buck, but how is the 2D? Oh, and I do game too.

Thanks:D
 

Schadenfroh

Elite Member
Mar 8, 2003
38,416
4
0
Matrox has traditionally been the king of 2D.

Now, with the advent of digital displays, I do not think it matters as much...

But, I certainly am open to someone proving me wrong!
 

Emulex

Diamond Member
Jan 28, 2001
9,759
1
71
Chrome 9 and XBMC use the GPU's power, but honestly at 2560x1600 the GeForce 220 or HD5450 ($9 after rebate) both rock the HTPC at about 4% across all 4 CPU cores (at the 6x minimum CPU multiplier) and about 5-8% video-engine usage (if that reading is accurate), with the ATI doing full bitstreaming and the GeForce not really being an audio card at all.

But I rock an old 6.1 Sony, so I'm good with TOSLINK.

I'm curious whether anyone has implemented HDCP decryption on the fly so you can reroute the audio through TOSLINK unencrypted.
 

cheez

Golden Member
Nov 19, 2010
1,722
69
91
ATI

"When comparing ATI against NVIDIA, we see higher scores across the board on ATI and more elaborate options in the driver settings."

http://www.techpowerup.com/reviews/HQV/HQV_2.0/8.html

Also see http://www.anandtech.com/show/3973/nvidias-geforce-gt-430/4
Wow great info. Thanks! Dang, my gut feeling was right. I could have sworn that, with similar video tweak settings in the driver, my X1950XTX seemed to display video (video rendering, clarity/cleanness, etc.) a bit better than the lower-end ATI card I have. Good to know that there's a difference in quality depending on the model you get, even within the same brand. I wonder if the ATi Radeon HD6870 has any better PQ than the HD5870... hmm. This is interesting.
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Guru3D did a good article on video image optimizing on both ATI and nVidia using Media Player Classic Home Cinema.

I really doubt you are going to notice much difference between the two, though. Both are fine.

http://www.guru3d.com/article/accelerate-x264-1080p-movies-over-the-gpu-guide/

Thanks for the link about setting up MPC-HC, but a) that's an old article (HD4xxx series; we're already at HD6xxx series now) and b) the article was more about comparing integrated graphics and not discrete cards (the OP is asking about cards, not IGPs), and c) there was no objective IQ comparison data, no HQV scores, either.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
I smell damage control incoming.

My next laptop is in the mail and it has a GT 445m. I wonder if I'll notice the difference. :hmm:
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
Thanks for the link about setting up MPC-HC, but a) that's an old article (HD4xxx series; we're already at HD6xxx series now) and b) the article was more about comparing integrated graphics and not discrete cards (the OP is asking about cards, not IGPs), and c) there was no objective IQ comparison data, no HQV scores, either.
Well, the link you gave simply presented a subjective score without any image shots to back it up. It just showed a results graph that gave the appearance of ATI being better without actually showing what was better. I'd hardly say that makes ATI better than nVidia for viewing 2D movies. That is mostly going to depend on the monitor/TV and the software used; with modern graphics cards these days, they all (both AMD and nVidia) deliver good quality.

That article used a discrete ATI card, btw. Hilbert simply listed a few IGPs as they are capable of good video playback, just not quite as good once you start using more advanced functions. It may be older, but nothing has changed, which is why it hasn't been updated.
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Well, the link you gave simply presented a subjective score without any image shots to back it up. It just showed a results graph that gave the appearance of ATI being better without actually showing what was better. I'd hardly say that makes ATI better than nVidia for viewing 2D movies. That is mostly going to depend on the monitor/TV and the software used; with modern graphics cards these days, they all (both AMD and nVidia) deliver good quality.

That article used a discrete ATI card, btw. Hilbert simply listed a few IGPs as they are capable of good video playback, just not quite as good once you start using more advanced functions. It may be older, but nothing has changed, which is why it hasn't been updated.

Did you just call HQV tests subjective? Seriously? What does that make the GURU3D article then, even-more-subjective? A no-show, since it didn't even try to compare IQ in an objective manner?

The GURU3D article was mostly written as a how-to guide for MPC-HC. But when it came time to compare GPUs, it was mostly comparing IGPs, with a discrete card thrown in as a point of reference. And all of those GPUs were old. The OP here is not trying to optimize MPC-HC or to compare old IGPs against each other; he wants to know which of the latest discrete cards have better 2D video playback. Totally different questions!

I agree that TV/monitor and software could affect IQ as well, but I think the appropriate phrase is "ceteris paribus." That is, on the exact same screen and software, which video cards tend to do better in 2D video playback?

Let me put it this way: imagine we are discussing audio instead, and he is talking about speaker quality. Imagine AMD and NV make sound cards. You are saying that maybe the receiver, AV cable, speakers, etc. have a bigger impact on overall audio experience than the sound cards do. That may be true, but that doesn't address OP's question, which I have interpreted as: holding ALL ELSE CONSTANT, which sound cards are better, the AMD ones or NV ones? (I may be misinterpreting his question, but I think the assumption implicit in his question is that you hold all else constant.)

For people with insensitive eyeballs (for lack of a better term... you know what I mean, I think), the overall viewing experience might not be that badly affected even if one video card has worse video playback than the other; in fact, a sufficiently insensitive eyeball would see no difference in quality. But even if that is true, it doesn't change the fact that NV has inferior 2D video playback right now. It's subjective just how inferior they are... maybe it's only slightly inferior so OP would never notice the difference anyway, just like how I (with my insensitive ears) can't really tell the difference between high-end and ultra-high-end audio.

But I know some people are very touchy about trying to get the best. That's why some people shell out small fortunes on A/V cables that give just a little bit better performance, for instance. Or even bigger fortunes on slightly better speakers, etc.
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Over a DVI connection, shouldn't 2d output be identical?

Imagine a really crappy sound card vs. a really good one. Same audio cable and interface standard. Same speakers and everything else, for that matter. The results will not be the same, though.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Wow great info. Thanks! Dang, my gut feeling was right. I could have sworn that, with similar video tweak settings in the driver, my X1950XTX seemed to display video (video rendering, clarity/cleanness, etc.) a bit better than the lower-end ATI card I have. Good to know that there's a difference in quality depending on the model you get, even within the same brand. I wonder if the ATi Radeon HD6870 has any better PQ than the HD5870... hmm. This is interesting.

Probably. The HD6xxx series can apparently score at LEAST 204: http://www.tomshardware.com/reviews/radeon-hd-6870-radeon-hd-6850-barts,2776-21.html

Compare this to the TPU results: HD5870 out of the box 117, defaults 177, optimized 197. GTX 480 out of the box 101, defaults 163, optimized 180. www.techpowerup.com/reviews/HQV/HQV_2.0/8.html

The GTX460 is apparently on par with a GTX480 in image quality: http://www.kitguru.net/components/graphic-cards/zardon/evga-gtx-460-768mb-superclocked-review/14/ (sounds like they were using newer drivers that put the default IQ of the GTX460 closer to its maximum)

Also see: 148 (GT 430) and 189 (HD5570) in the anandtech comparison I linked to before: http://www.anandtech.com/show/3973/nvidias-geforce-gt-430/4 But I think those scores were unoptimized so they could be higher with some tweaking.

So the pecking order is:

204 (6870 somewhat optimized; sounds like Tom's didn't have a chance to play with settings to determine maximum HQV 2.0 score)
197 (5870 optimized)
193 (5830 probably nearly optimized with new drivers)
189 (5570 default)

180 (GTX480 optimized)
177 (GTX460 nearly optimized with new drivers)
148 (GT 430 default)


If I were you and planned to watch video more than play games on an HTPC, I would get a HD6850 or HD5750/5770 (depending on how much gaming power you need) and call it a day. In gaming, the HD6850 is on par with a GTX460-1GB, but the primary use of a HTPC card is for video playback, and HD6870/50 apparently excels at video playback, with HD5xxx close behind (get at least a 5570 though because any lower and HQV scores start dropping and cheese slice tests get worse), and GTX460 behind both of them. (I am assuming a GTX460 can't do any better than an optimized GTX480's score of 180.)
 
Last edited:

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
I'm just gonna say that the guide on how to set up Media Player Classic was nice (I started playing around with the shader filters etc.). Though I don't attach much importance to 2D quality issues (between cards), the stuff about how to set up Media Player Classic did visually improve how my movies looked.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I'm just gonna say that the guide on how to set up Media Player Classic was nice (I started playing around with the shader filters etc.). Though I don't attach much importance to 2D quality issues (between cards), the stuff about how to set up Media Player Classic did visually improve how my movies looked.

I recommend hitting Ctrl+P in various scenes (to toggle the pixel shaders on/off) and seeing which look you prefer. For me, the sharpening and blacker blacks made my test movie (Avatar) look worse, both in combination and separately. You get harsher transitions in shadows and you lose detail. E.g., near the beginning of Avatar, the outer-space nebulae almost disappear with the blacker blacks, and the planet looks more artificial when sharpened and not allowed to keep the diffuse clouds and stuff on the surface. Different strokes for different folks.
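To give a rough idea of why a "blacker blacks" filter can eat shadow detail, here's a tiny sketch (plain Python with NumPy, not MPC-HC's actual shader code, and the 0.08 black point is just a number I picked for illustration): everything below the new black point gets crushed to zero, so faint gradations near black all flatten out.

[CODE]
import numpy as np

# Illustration only (not MPC-HC's shader): raising the black point remaps
# pixel values so anything below the new black point becomes 0, which is
# why dim detail like a faint nebula can vanish.
def raise_black_point(pixels, black_point=0.08):
    """pixels: float values in [0, 1]; returns remapped values in [0, 1]."""
    return np.clip((pixels - black_point) / (1.0 - black_point), 0.0, 1.0)

shadow_detail = np.array([0.02, 0.04, 0.06, 0.10])  # faint gradations near black
print(raise_black_point(shadow_detail))             # first three collapse to 0.0
[/CODE]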
 

Painman

Diamond Member
Feb 27, 2000
3,728
29
86
It's a totally irrelevant discussion if you're outputting to a digital display via DVI-D or HDMI. (Edit: or DisplayPort.)

I doubt that either co. is spending much money per card on analog filtering quality. CRTs are dinosaurs, and digital flat panels don't care.
 
Last edited:

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Imagine a really crappy sound card vs. a really good one. Same audio cable and interface standard. Same speakers and everything else, for that matter. The results will not be the same, though.

With digital audio output, there shouldn't be any difference either, as long as it's always digital (and they support the same settings). They may have a difference in how they filter video playback, but they should be pretty much identical for generic 2d stuff.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
Matrox has traditionally been the king of 2D.

Now, with the advent of digital displays, I do not think it matters as much...

But, I certainly am open to someone proving me wrong!
The voodoo3 and 5 also had excellent output quality, although not quite as good as Matrox.

Or it may just be that 3dfx's reference rasterizer looked so damn good which has nothing to do with the DAC output quality.

In any event, you never want an integrated DAC, but fortunately, digital has taken over, so they can just put the transmitter in the processing chip.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
With digital audio output, there shouldn't be any difference either, as long as it's always digital (and they support the same settings). They may have a difference in how they filter video playback, but they should be pretty much identical for generic 2d stuff.

It's a totally irrelevant discussion if you're outputting to a digital display via DVI-D or HDMI.

I doubt that either co. is spending much money per card on analog filtering quality. CRTs are dinosaurs, and digital flat panels don't care.

"Inside PCs, all the magic happens with graphics processors. There is a long standing myth that since analog media (such as tapes) were replaced by digital formats, all lossless video consumed from DVDs, Blu-ray discs, and lossless video files end up being of the same quality once it reaches the screen. Of course one can argue that image quality varies among displays, but another component that contributes to image quality is the graphics card.

Even though the graphics card connects to your display over a digital connection, it is the GPU, which can enhance the fidelity of the images its sending to the display, in addition to the original lossless recording. Lossless aside, with rapid advancements in Internet bandwidth, the Internet is serving as the best medium for content distribution. Videos rented, sold, or exhibited on the web come in lossy formats to reduce bandwidth consumption (and transfer time). This increases the importance of GPU to give out the best image quality of the encoded content, and enhance its quality artificially. With limited bandwith at HD resolution you will see the introduction of compression artifacts, or upscaling artifacts at low resolution content on HD displays." --TechPowerUp

I could be misinterpreting TPU's review, but if I am not, I will refer you both to TPU's explanation of what HQV 2.0 is testing for. It's also relevant for stuff like fixing Hulu clips and such--things an HTPC will probably spend lots of time doing.

I already linked to TPU's review earlier in this thread, but for convenience, I will link to it again, starting on page 1 this time instead of the conclusion:

http://www.techpowerup.com/reviews/HQV/HQV_2.0/1.html

If that's not enough, try HQV's own website (note they have both SD and HD versions of their software): http://www.hqv.com/index.cfm?page=benchmark

I think I needlessly introduced confusion by comparing the video situation to an analog audio situation; sorry about that. I should have used another example.
 
Last edited:

cheez

Golden Member
Nov 19, 2010
1,722
69
91
I smell damage control incoming.

My next laptop is in the mail and it has a GT 445m. I wonder if I'll notice the difference. :hmm:
On small screens like laptops you may not notice much of a PQ difference, but everybody's eyes and expectations are different... I'm extremely picky with PQ in 2D, so it depends on personal preference I suppose. If I were to buy myself a new laptop, I would still lean toward getting one equipped with an ATi card.


The GURU3D article was mostly written as a how-to guide for MPC-HC. But when it came time to compare GPUs, it was mostly comparing IGPs, with a discrete card thrown in as a point of reference. And all of those GPUs were old. The OP here is not trying to optimize MPC-HC or to compare old IGPs against each other; he wants to know which of the latest discrete cards have better 2D video playback. Totally different questions!
Most excellent post, sir!:D Yes, that's exactly my point. I've got my own superb software (better than MPC) to take care of the playback part. It's the video card with the edge in 2D that I'm trying to pick out.



Let me put it this way: imagine we are discussing audio instead, and he is talking about speaker quality. Imagine AMD and NV make sound cards. You are saying that maybe the receiver, AV cable, speakers, etc. have a bigger impact on overall audio experience than the sound cards do. That may be true, but that doesn't address OP's question, which I have interpreted as: holding ALL ELSE CONSTANT, which sound cards are better, the AMD ones or NV ones? (I may be misinterpreting his question, but I think the assumption implicit in his question is that you hold all else constant.)
You interpreted my question perfectly. Thanks.:)

For people with insensitive eyeballs (for lack of a better term... you know what I mean, I think), the overall viewing experience might not be that badly affected even if one video card has worse video playback than the other; in fact, a sufficiently insensitive eyeball would see no difference in quality.
I have very sensitive eyeballs.:D

But I know some people are very touchy about trying to get the best.
You are so on point that I have nothing much else to say.:awe: Yes, any PQ edge counts and that is what I'm after.


So what should I get... ATi Radeon HD5870 or HD6870?:)
 

Painman

Diamond Member
Feb 27, 2000
3,728
29
86
A codec is a codec. It's hardware independent. It reads in data X, and outputs data Y, or vice versa. A given graphics chip maker may write optimizations into their drivers to "improve" the output of a codec, but it'll usually be under certain conditions, such as when the graphics driver detects that it's decoding something popular such as MPEG-2 data.

If you wish to know which card maker's drivers do a better job of "optimizing" the output of codec X, you'll have to look for relevant reviews.

I own a brutally analytical display panel, so almost all MPEG-2 still looks crappy to me. :p This is also a factor in the equation. The better I calibrate the display, the worse that low-quality images tend to appear on it.

That's the media's fault. Not my display system's fault.
 

ZimZum

Golden Member
Aug 2, 2001
1,281
0
76
NVIDIA does some things well here while stumbling in other areas. So let’s start where they stumble: image quality. In the quest for the more perfect HTPC card we found the Radeon HD 5570 earlier this year, and it was good. In more recent times AMD has put a lot of effort in to stepping up their game on image quality with the release of the HQV 2 benchmark and it shows in our results.

We always hate to rely so much on a single benchmark, but at this point HQV 2 provides the best tests we can get our hands on, so we can’t ignore the results. Certainly the GT 430 is a step up from the likes of Intel’s GMA, however the Radeon 5570 has an even bigger advantage over the GT 430. If image quality absolutely matters to you, then the Radeon 5570 is definitely the card to get for the time being until NVIDIA can spend more time on improving the video capabilities of their drivers.

http://www.anandtech.com/show/3973/nvidias-geforce-gt-430/18
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
ATi Radeon HD5870 or HD6870?
@Cheez,

If you need to upgrade and you're worried about IQ, then the 6870 is probably a better bet; it has a few improvements over the 5xxx series in that department. It is a slower card than the 5870, though.

I own a 5870 and I like the card. I don't own a 4xx to compare it to... I've never really thought IQ was an issue when picking either nVidia or ATI cards.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
A codec is a codec. It's hardware independent.

If you wish to know which card maker's drivers do a better job of "optimizing" the output of codec X, you'll have to look for relevant reviews.

Interesting, but HQV scores are clearly different even within the same family of hardware, let alone between different companies' hardware. See for instance the low score of GT 430 vs. the higher scores of GTX 460 and GTX 480. And the GT 430 was using newer drivers, too.

You can also optimize for HQV somewhat (see TPU's review).

This implies that a combination of hardware and software goes into an HQV score.

Speaking of HQV, what is your interpretation of AT, TPU, Tom's and KitGuru's testing then? Because they show different levels of performance on the HQV 2.0 test. Are you saying it was pointless to use HQV 2.0, or that HQV 2.0 is flawed, and that an IGP will do as good of a job as an HD6870 when it comes to video playback?

And why would reputable sites like AT use HQV 2.0 to judge HTPC performance, if HQV were pointless or flawed?
 

cheez

Golden Member
Nov 19, 2010
1,722
69
91
Over a DVI connection, shouldn't 2d output be identical?
No, apparently they are not the same. I use DVI-D to run my professional-grade 50" plasma displays. With the same plasma screen, the same connection type, the same tweaking, and the same software, I noticed a difference in PQ.



Probably. The HD6xxx series can apparently score at LEAST 204: http://www.tomshardware.com/reviews/radeon-hd-6870-radeon-hd-6850-barts,2776-21.html

Compare this to the TPU results: HD5870 out of the box 117, defaults 177, optimized 197. GTX 480 out of the box 101, defaults 163, optimized 180. www.techpowerup.com/reviews/HQV/HQV_2.0/8.html

The GTX460 is apparently on par with a GTX480 in image quality: http://www.kitguru.net/components/graphic-cards/zardon/evga-gtx-460-768mb-superclocked-review/14/ (sounds like they were using newer drivers that put the default IQ of the GTX460 closer to its maximum)

Also see: 148 (GT 430) and 189 (HD5570) in the anandtech comparison I linked to before: http://www.anandtech.com/show/3973/nvidias-geforce-gt-430/4 But I think those scores were unoptimized so they could be higher with some tweaking.

So the pecking order is:

204 (6870 somewhat optimized; sounds like Tom's didn't have a chance to play with settings to determine maximum HQV 2.0 score)
197 (5870 optimized)
193 (5830 probably nearly optimized with new drivers)
189 (5570 default)

180 (GTX480 optimized)
177 (GTX460 nearly optimized with new drivers)
148 (GT 430 default)


If I were you and planned to watch video more than play games on an HTPC, I would get a HD6850 or HD5750/5770 (depending on how much gaming power you need) and call it a day. In gaming, the HD6850 is on par with a GTX460-1GB, but the primary use of a HTPC card is for video playback, and HD6870/50 apparently excels at video playback, with HD5xxx close behind (get at least a 5570 though because any lower and HQV scores start dropping and cheese slice tests get worse), and GTX460 behind both of them. (I am assuming a GTX460 can't do any better than an optimized GTX480's score of 180.)
Thank you for the additional links! That was very helpful. Looks like I got the info I needed. I'm gonna try to go with the highest-end card possible within my budget (max of $400~$450); this will also help keep me up to date with 3D performance, as I'm not planning to upgrade video cards for another 4 years or so. My X1950XTX is around 4 years old now and has been doing a fantastic job. I use it for video playback 80% of the time and 20% for gaming. I think I'm gonna try to pick up an HD6870.:)




The voodoo3 and 5 also had excellent output quality, although not quite as good as Matrox.

Or it may just be that 3dfx's reference rasterizer looked so damn good which has nothing to do with the DAC output quality.

In any event, you never want an integrated DAC, but fortunately, digital has taken over, so they can just put the transmitter in the processing chip.
Yo, I have been through the good ol' Voodoo and TNT days, and I remember the Voodoo didn't do a good job in 2D. Yes, Matrox was good back then.