DaveSimmons
Elite Member
- Aug 12, 2001
- 40,730
- 670
- 126
So: AMD wins for torrented video playback, while nvidia wins for gaming (unless you turn off AMD's over-optimizing to improve FPS)?
Hopefully we'll get more posts, ideally from someone with cards from both camps (say a GTX 460 and an HD 6850) who has done some encoding/decoding and can share their experience with the resulting video quality.
Video playback should look identical no matter what card you're using unless you're doing some kind of post-processing of the video (which I would say is probably a bad idea in most if not all cases).
Also, I don't think there's anything special that Nvidia or ATI does that would separate them from a software-based implementation (say, ffdshow filters or the like).
The H.264 standard requires that the decoded output be identical no matter which decoder is used (unlike, say, MPEG-2, where there were slight differences depending on the IDCT implementation, though those are probably unnoticeable in practice anyway). So even if you're using hardware decoding (DXVA/VDPAU), the output will be identical (unless there's a bug somewhere, of course).
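That bit-exactness claim is easy to check mechanically: decode the same stream with two decoders, hash each output frame, and compare (ffmpeg's framemd5 muxer reports exactly these per-frame hashes). A minimal Python sketch of the idea, with placeholder bytes standing in for real decoded frames:

```python
import hashlib

def frame_digests(frames):
    """Per-frame MD5 of the raw decoded pixels, framemd5-style."""
    return [hashlib.md5(f).hexdigest() for f in frames]

# Placeholder bytes standing in for decoded YUV frames from two different
# spec-conformant H.264 decoders (e.g. a software decoder vs. DXVA):
frames_software = [b"\x00\x01\x02\x03", b"\x10\x11\x12\x13"]
frames_hardware = [b"\x00\x01\x02\x03", b"\x10\x11\x12\x13"]

# A pair of conformant decoders must match on every frame:
assert frame_digests(frames_software) == frame_digests(frames_hardware)
```

If the per-frame digests differ, either one decoder is buggy or some post-processing is being applied before the comparison point.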
So when you ask "Why do Fermi cards have bad video quality?", what you're really asking is why the video looks subjectively worse when you enable post-processing effects you probably shouldn't be using in the first place.
Edit:
I'm pretty sure Nvidia disables all post-processing effects by default so I really don't understand where the OP's topic comes from.
Video card for encoding? No thanks, the implementations all produce inferior output.
Stick to x264; your CPU is better at integer math anyway.
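For reference, a typical x264 invocation for a quality-targeted single-pass encode looks like this (filenames are placeholders):

```shell
# Single-pass CRF encode: --crf sets the quality target (lower = better quality,
# larger file); --preset trades encoding speed for compression efficiency.
x264 --preset slow --crf 20 --output out.264 input.y4m
```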
Torrent video streams?
I'm not sure why that needs to be 'subjective'.
I'm glad the discussion is going well and heading in the right direction.
Hopefully we'll get more posts, ideally from someone with cards from both camps (say a GTX 460 and an HD 6850) who has done some encoding/decoding and can share their experience with the resulting video quality.
I've also noticed another AnandTech article about the GTX 460 saying it has somewhat worse video playback quality, as it tends to blur some small parts of the image.
The title of this thread could use a change. It says the quality is "bad," while the article states most people won't notice a difference. Only by some obscure measurements, with both cards in front of you, are you likely to notice. I don't see threads attacking TN panels because they're inferior to IPS panels. Why is that?
Maybe someday Nvidia will step up its game and enable the video-processing features that AMD cards offer now. I applaud articles like this that point out weaknesses in technology; they force the companies to take notice and, hopefully, change for the better.
So enabling filters by default that I don't need or want is stepping up the game?
I think you overestimate what the filters actually do.