
ATI vs. NVIDIA Video Quality

According to this article here at AnandTech, NVIDIA is better than ATI in video quality.

Does that mean NVIDIA cards have better quality when playing DVDs, XviD/DivX, etc.?

And does that mean NVIDIA has better TV-Out quality?

But it doesn't imply anything about game quality, right?
 
Looking at that forum quickly, I assume it has nothing to do with games, just VIVO and PureVideo.

Unless you're playing games via TV-Out... Then there's something wrong with you 😛
 
There is nothing wrong with playing games while sitting on the couch in front of a big TV, but TV-out has nothing to do with the information in that article. The article is about deinterlacing things like DVDs and live TV, in which case it seems Nvidia does have slightly better quality. This shouldn't be confused with saying they have better DVD or live TV quality in general, though, as deinterlacing is only one of many factors that contribute to overall image quality.
 
Both companies have good IQ, with the nod going to ATi overall. If I were buying and all other things were equal, I'd go ATi... but since all those other things are rarely equal, it usually isn't the determining factor. Just get whichever card has the best price/performance in your budget.
 
why create a fight if you can't even spell nvidia? go search the forums...there are so many threads already.
 
Originally posted by: Rage187
Nowadays both companies have great IQ.


Fanboys just try to make the claim that their company has the better IQ.

I would have to agree with you there, although the FX series IQ wasn't as good as the Radeon 9 series. It wasn't much worse, but it was noticeable.
 
Originally posted by: Rock Hydra
Originally posted by: Rage187
Nowadays both companies have great IQ.


Fanboys just try to make the claim that their company has the better IQ.

I would have to agree with you there, although the FX series IQ wasn't as good as the Radeon 9 series. It wasn't much worse, but it was noticeable.

QFT

 
Originally posted by: johnnqq
why create a fight if you can't even spell nvidia? go search the forums...there are so many threads already.

Happy now??
And BTW, I did search the forum and found nothing relevant.

To the others who replied: thanks for your informative answers. Now I can choose the X800XL over the 6800GT and sleep well.

Edit: Spelling
 
Originally posted by: userofcomputer
According to this article here at AnandTech, NVIDIA is better than ATI in video quality.

Does that mean NVIDIA cards have better quality when playing DVDs, XviD/DivX, etc.?

And does that mean NVIDIA has better TV-Out quality?

But it doesn't imply anything about game quality, right?

That's about deinterlacing video quality, so it's in regard to DVD/tape/TV/etc. quality. It has no relevance to games whatsoever. I think XviD/DivX files have usually been deinterlaced already (although I could be wrong).

Basically it means that ATI's algorithm for taking an interlaced signal and displaying it on a digital (non-interlaced) screen isn't quite as good as Nvidia's.
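
For anyone curious what deinterlacing actually involves, here's a minimal sketch of the simple "bob" approach in Python with NumPy: each field becomes its own progressive frame, with the missing scanlines interpolated. The function name, frame layout, and border handling are illustrative assumptions on my part; the actual driver algorithms are motion-adaptive and far more sophisticated than this.

```python
# A minimal "bob" deinterlacing sketch (illustrative only; real GPU
# deinterlacers use motion-adaptive algorithms far beyond this).
import numpy as np

def bob_deinterlace(frame: np.ndarray):
    """Split one interlaced frame (H x W, grayscale, H even) into two
    progressive frames, one per field, by interpolating the scanlines
    that belong to the other field."""
    top = frame.astype(np.float32)
    bottom = frame.astype(np.float32)
    # Top-field frame: rebuild the odd lines from neighbouring even lines.
    top[1:-1:2] = (top[0:-2:2] + top[2::2]) / 2
    # Bottom-field frame: rebuild the even lines from neighbouring odd lines.
    bottom[2::2] = (bottom[1:-1:2] + bottom[3::2]) / 2
    # Border lines are left untouched for brevity.
    return top.astype(frame.dtype), bottom.astype(frame.dtype)
```

The alternative, "weave" (just showing both fields as one frame), needs no interpolation but produces combing artifacts on motion, which is exactly the kind of thing the review's tests pick up.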
 
Originally posted by: Nirach
Unless you're playing games via TV-Out... Then there's something wrong with you 😛


Yes, playing video games on a TV is so strange. 😱
I prefer playing some computer games on a TV, especially console emulators. I think a larger screen can be more immersive.
 
Keep in mind PureVideo's been out for a while, whereas AVIVO was just born. Give it some time; I'm sure in the end both companies will be tied.
 
From what I've read:

DVD/TV Quality = NV better
AA/AF (Game) Quality = ATI Better

However, the differences are pretty minor between the two in both cases, so you probably can't go wrong with either one.

The only thing I'd really look out for are cards with significantly more RAM than their GPU speed can handle. Both companies make cards that pair extra RAM with a slower GPU, and in pretty much every case you will get worse performance from the extra RAM than if you had spent the money on the next higher GPU with less RAM instead.

-D'oh!
 
Originally posted by: jiffylube1024

That's about deinterlacing video quality, so it's in regard to DVD/tape/TV/etc. quality. It has no relevance to games whatsoever. I think XviD/DivX files have usually been deinterlaced already (although I could be wrong).

Basically it means that ATI's algorithm for taking an interlaced signal and displaying it on a digital (non-interlaced) screen isn't quite as good as Nvidia's.

Yeah, essentially just for SDTV on the PC, but there are excellent software filters for that, like DScaler. But really, no one watches such material on their PC display anyway. DVD movies (and transcodes thereof) are progressive sources, so the claimed differences are irrelevant.
 
Originally posted by: TheSnowman
DVDs are interlaced; that was the whole point of the AnandTech review linked at the top of this thread.

I think it depends on the source. Most of them (quality movies) are mostly progressive (90%+) as reported by DVD2AVI and the like: even though the content is stored interlaced, it can be flagged progressive. I don't know enough about it to say how this is interpreted by the decoding hardware and software, or whether one supersedes the other, especially depending on whether hardware decoding is enabled. The article states that ATI's image quality was the same using two different decoders (PureVideo and Intervideo), but is the same true of Nvidia's? It would be interesting to see the result if both were benchmarked with ATI's MMC (Cyberlink decoder), or even Intervideo's. Also, it would seem important to note the quality of TV-out, as any minor advantage on the PC could easily be lost to poor TV-out.
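
On the progressive-flag question: one way to see how much of a given rip is flagged progressive is to count the per-frame flags the decoder is handed. Below is a hedged sketch using ffprobe's per-frame interlaced_frame field; the file name is a placeholder, and this is just one way to count the flags, not what DVD2AVI itself does.

```python
# A rough sketch: count how many of the first N decoded frames are
# flagged progressive, via ffprobe's per-frame "interlaced_frame"
# field. The file name below is a placeholder; the result depends
# entirely on how the source was authored and flagged.
import subprocess

def progressive_ratio(path: str, max_frames: int = 2000) -> float:
    """Fraction of the sampled frames flagged progressive (0..1)."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-read_intervals", f"%+#{max_frames}",
         "-show_entries", "frame=interlaced_frame",
         "-of", "csv=p=0", path],
        capture_output=True, text=True, check=True,
    ).stdout.split()
    flags = [int(x) for x in out if x in ("0", "1")]
    # interlaced_frame == 0 means the frame is flagged progressive.
    return flags.count(0) / len(flags) if flags else 0.0

# e.g. progressive_ratio("movie.vob") returning ~0.95 would line up
# with the claim that most film-sourced DVDs are 90%+ progressive.
```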
 