Why do Fermi cards have bad video quality?


DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
So: AMD wins for torrented video playback, while nvidia wins for gaming (unless you turn off AMD's over-optimizing to improve FPS)?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
So: AMD wins for torrented video playback, while nvidia wins for gaming (unless you turn off AMD's over-optimizing to improve FPS)?

AMD reverted to the same great image quality after Catalyst 10.12, IIRC. Therefore, their High Quality default setting is no longer a problem.
 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
Video playback should look identical no matter what card you're using unless you're doing some kind of post-processing of the video (which I would say is probably a bad idea in most if not all cases).

Also, I don't think there's really anything special that Nvidia or ATI does that would separate them from a software-based implementation (say, ffdshow filters or whatever).

The H.264 standard requires that, no matter what decoder is used, the output has to be identical (as opposed to, say, MPEG-2, where there were slight differences depending on the IDCT, though that is probably unnoticeable in all cases anyway). So even if you're using hardware decoding (DXVA/VDPAU), the output will be identical (unless there's a bug somewhere, of course).
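For anyone who wants to check that claim themselves, here's a minimal sketch (just an illustration: it assumes ffmpeg is on the PATH with DXVA2 or VDPAU support, and "sample.mkv" is a placeholder filename). It compares per-frame checksums from a plain software decode against a hardware-assisted decode of the same H.264 file:

```python
import subprocess

def frame_md5s(path, hwaccel=None):
    """Return ffmpeg's per-frame MD5 checksums for the decoded video stream."""
    cmd = ["ffmpeg", "-v", "error"]
    if hwaccel:
        # e.g. "dxva2" on Windows or "vdpau" on Linux
        cmd += ["-hwaccel", hwaccel]
    # -an drops audio; -pix_fmt yuv420p normalizes the NV12 frames that hardware
    # decoders hand back, so both runs hash the same pixel layout; the framemd5
    # muxer writes one checksum line per decoded frame to stdout.
    cmd += ["-i", path, "-an", "-pix_fmt", "yuv420p", "-f", "framemd5", "-"]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
    return [line for line in out.splitlines() if not line.startswith("#")]

software = frame_md5s("sample.mkv")
hardware = frame_md5s("sample.mkv", hwaccel="dxva2")
print("bit-exact" if software == hardware else "outputs differ")
```

If the two lists match, the hardware decoder is giving you exactly the same picture as the software one, and any visible difference has to come from post-processing further down the chain.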

So when you say "Why do Fermi cards have bad video quality?", what you're really asking is "why does the video look worse when I'm using post-processing effects I really shouldn't be using in the first place?"

Edit:
I'm pretty sure Nvidia disables all post-processing effects by default so I really don't understand where the OP's topic comes from.

Hopefully we get more people posting, possibly someone with video cards from both camps, like the 460 and 6850, who has done some encoding/decoding and can write about his experience with the quality of the videos.

Video card for encoding? No thanks, all implementations suck.
Stick to x264; your CPU is better at integer math anyway.
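For what it's worth, a CPU-based x264 encode is a one-liner through ffmpeg's libx264 wrapper. A minimal sketch, assuming such a build is installed; the filenames and CRF value are placeholders:

```python
import subprocess

# Constant-quality software encode with x264 (via ffmpeg's libx264 encoder).
# Lower CRF means higher quality and a larger file; the "slow" preset trades
# encoding time for compression efficiency. Audio is passed through untouched.
subprocess.run([
    "ffmpeg", "-i", "input.mkv",
    "-c:v", "libx264", "-preset", "slow", "-crf", "18",
    "-c:a", "copy",
    "output.mkv",
], check=True)
```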
 

Dadofamunky

Platinum Member
Jan 4, 2005
2,184
0
0
On the topic of this thread, all I can say is WTF?

I have an AMD at home and an Nvidia at my crib near work, and both are excellent. I couldn't care less about arguments over which is better for video. Part of the reason I bought Nvidia this time was its Photoshop CUDA support. For video encoding, I prefer using the CPU. Neither GPU maker has gotten things exactly right in my experience.

At some point, when Z68 comes out, I may do a mobo swap so I can use Intel Quick Sync. That would render all other arguments moot.

Torrent video streams? Please. If you're using torrents, you're robbing artists and creative workers. If you're too cheap to even use Netflix, I don't know what to tell you. You can't even download a free Linux distro via torrent without problems. Torrents are overrated and the ecosystem is polluted by scumbags.
 

Aristotelian

Golden Member
Jan 30, 2010
1,246
11
76
Video playback should look identical no matter what card you're using unless you're doing some kind of post-processing of the video (which I would say is probably a bad idea in most if not all cases).

Also, I don't think there's really anything special that Nvidia or ATI does that would separate them from a software-based implementation (say, ffdshow filters or whatever).

The H.264 standard requires that, no matter what decoder is used, the output has to be identical (as opposed to, say, MPEG-2, where there were slight differences depending on the IDCT, though that is probably unnoticeable in all cases anyway). So even if you're using hardware decoding (DXVA/VDPAU), the output will be identical (unless there's a bug somewhere, of course).

So when you say "Why do Fermi cards have bad video quality?", what you're really asking is "why does the video look subjectively worse when I'm using post-processing effects I really shouldn't be using in the first place?"

Edit:
I'm pretty sure Nvidia disables all post-processing effects by default so I really don't understand where the OP's topic comes from.



Video card for encoding? No thanks, all implementations suck.
Stick to x264; your CPU is better at integer math anyway.

I'm not sure why that needs to be 'subjective'.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Torrent video streams?

The only person to mention torrented video used it as a snide remark with a negative connotation, as in: this video issue doesn't matter since we don't pirate. So actually, nobody in this thread said they torrent.
Also, there are many non-pirated things you can download via torrent, even videos.
Torrent != piracy.
Although people often say it to mean they pirate movies and the like.
 

SRoode

Senior member
Dec 9, 2004
243
0
0
I have an ATI 5850 as well as an Nvidia GTX 570. The video quality (at least to my eyes) seems identical.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
I'm glad the discussion is going well and taking the right path.

Hopefully we get more people posting, possibly someone with video cards from both camps, like the 460 and 6850, who has done some encoding/decoding and can write about his experience with the quality of the videos.

I've also noticed another article on AnandTech about the GTX 460 in which they say it has somewhat worse video playback quality, as it tends to blur some small parts of the image.

SlickR12345, would you please address SHAQ's post for the benefit of the readers of your thread? Thanks in advance :)

The title of the thread could use a change. It says Fermi has "bad" quality, while the article states most people won't notice. Only with some obscure measurements, and with both cards in front of you, are you likely to notice. I don't see threads attacking TN panels because they are inferior to IPS panels. Why is that?
 

waffleironhead

Diamond Member
Aug 10, 2005
7,046
549
136
The title of the thread could use a change. It says Fermi has "bad" quality, while the article states most people won't notice. Only with some obscure measurements, and with both cards in front of you, are you likely to notice. I don't see threads attacking TN panels because they are inferior to IPS panels. Why is that?

I remember a few months ago Nvidia and their crew were complaining about AMD's video quality being worse, despite most sites claiming that, outside of a benchmark, most people wouldn't notice. This reminds me of much the same thing.

Bad is subjective. All it takes is being worse than the competition.

Maybe someday Nvidia will step up its game and enable all of the capabilities that AMD cards have now. I applaud articles like this that point out weaknesses in technology; it forces the companies to take notice and, hopefully, change for the better.
 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
Maybe someday Nvidia will step up its game and enable all of the capabilities that AMD cards have now. I applaud articles like this that point out weaknesses in technology; it forces the companies to take notice and, hopefully, change for the better.

So enabling filters by default that I don't need or want is stepping up the game?

I think you overestimate what the filters actually do.
 

darckhart

Senior member
Jul 6, 2004
517
2
81
I agree with TheRyuu.

I have owned multiple ATI and NV cards since 2000. There is no such thing as uncompressed video unless you have the source, and it came from something so ridiculously high-res that the common man couldn't afford it. Every video file conforms to some standard which states its compression specs.

Playback of compressed video is part of a long chain. The only thing your video card should be doing is utilizing DXVA, and that's it. Everything else about the video is controlled by something else (e.g., TV settings, playback software, filters, etc.). There is no video quality difference in playback between the cards. If you tweak those settings and see differences, that is a direct result of what you did.
 

waffleironhead

Diamond Member
Aug 10, 2005
7,046
549
136
So enabling filters by default that I don't need or want is stepping up the game?

I think you overestimate what the filters actually do.

No, it's not enabling the filters by default. It's the fact that the Nvidia cards' filters, even when enabled, don't affect anything.

"The GeForce cards don’t have a dedicated option to reduce compression artifacts, such as mosquito noise reduction or de-blocking, and the noise reduction setting in the Nvidia driver doesn’t appear to affect anything but grainy noise. Because we can’t see any difference with noise reduction on or off, we’re forced to give the GeForce cards a zero score in this test."

To me, that looks like something Nvidia is lacking.

I'm not overestimating what the filters on Nvidia cards do; according to the test run, they didn't do anything.
 

amenx

Diamond Member
Dec 17, 2004
4,406
2,727
136
If I were a fanboy of one or the other card maker, that's probably the sort of title I would choose. :D

Video quality and gaming quality are of paramount importance to me. If I thought for a moment that I was not getting as good quality as I should from my NV card, I would dump it faster than a hot potato. Not that I disagree with the premise of the article; there may indeed be something to it, but I do not believe it amounts to anything of more than infinitesimal significance. Like those German articles pointing out ATI had lowered IQ settings in their drivers to give an FPS advantage to their new 68xx cards vs Nvidia.