Quick Sync review, and the results regarding nVIDIA and AMD.

Bearach

Senior member
Dec 11, 2010
312
0
0
Quick Sync may have put a spanner in the works for CUDA video encoding, and even APP/Stream, but I wanted to look at the issue of video quality regarding nVIDIA's CUDA and AMD's Stream/APP encoders.

I've only ever tried Badaboom... and it was terrible. I bought into all the hype about finally having a faster way to encode, but the quality was abysmal.

Looking at the comparisons between them all, Quick Sync is by far the fastest and gives the best image quality, but how do nVIDIA and AMD stack up against one another?

According to the Sandy Bridge review, the GeForce GTX 460 does well on speed, but looking at the screenshots, it really has the worst quality of them all in my opinion.

The AMD 6870 is slower but consistently gives a better image; look at the screenshots below and tell me what you think.

Also, are Anand's results reliable? Have you used both, or either, and found them comparable to your own results?

Casino Royale Transcode :

nVIDIA : http://images.anandtech.com/reviews/cpu/intel/sandybridge/review/quicksync/casinoroyale/gtx460.png
AMD : http://images.anandtech.com/reviews/cpu/intel/sandybridge/review/quicksync/casinoroyale/6870.png

Quantum of Solace Transcode :

nVIDIA : http://images.anandtech.com/reviews...e/review/quicksync/quantumofsolace/gtx460.png
AMD : http://images.anandtech.com/reviews/cpu/intel/sandybridge/review/quicksync/quantumofsolace/6870.png

Dark Knight Transcode :

nVIDIA : http://images.anandtech.com/reviews/cpu/intel/sandybridge/review/quicksync/darkknight/gtx460.png
AMD : http://images.anandtech.com/reviews/cpu/intel/sandybridge/review/quicksync/darkknight/6870.png

No poll or anything, no fanboi flame bait, just a reasonable discussion on these results please.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
Casino Royale looks very similar, but high-motion scenes are where you generally see the flaws and drawbacks of encoded BR rips.

The GTX 460 encodes of Quantum of Solace and Dark Knight are clearly piss poor and very grainy compared to the 6870's. I'm surprised there is that much difference.

And I agree Badaboom is absolute garbage, it doesn't even work on GTX 4XX Fermi cards, which makes it useless anyways.
 

WelshBloke

Lifer
Jan 12, 2005
32,434
10,563
136
I'm pretty :eek: at how well Intel Quick Sync does in these tests; it's faster and better quality than the main players there.

It's like the level of hype has been inversely proportional to how good the product is.

Nvidia has been talking up Cuda encoding/transcoding for ages.
AMD has been "me too-ing" for a while as well.

For Intel to sneak up and steal both their lunches with some GPU-accelerated action is a bit :eek:; it's fairly :cool: as well.


Edit: Just trying out the ArcSoft converter, and it doesn't seem to load my 6950 up at all; it's fluctuating between 10% and nothing.
 
Last edited:

Tsavo

Platinum Member
Sep 29, 2009
2,645
37
91
I have a GTX 460 with PowerDirector 9. Using CUDA is only marginally useful for speed; apply any useful filters and the program falls back to the CPU, which defeats the point of having CUDA in the first place.

Filterless transcodes can be quite fast but very ugly. Screenshots don't fully portray how bad a transcode is; you've got to see the video in motion to be gobsmacked at how bad the image quality is. I used the same program on the same computer, but with an ATI 4850, which produced much nicer image quality.
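Not from the thread, just a sketch of how you could put a number on "ugly" instead of eyeballing screenshots: PSNR between a source frame and the transcoded frame. The frames here are synthetic stand-ins (flattened pixel lists), since the actual rips obviously aren't available.

```python
import math
import random

def psnr(reference, encoded, max_value=255.0):
    """Peak signal-to-noise ratio in dB; higher means closer to the source."""
    mse = sum((r - e) ** 2 for r, e in zip(reference, encoded)) / len(reference)
    if mse == 0:
        return float("inf")  # identical frames
    return 10.0 * math.log10(max_value ** 2 / mse)

# Synthetic example: a "source" frame and a grainier "transcode" of it.
random.seed(0)
source = [random.randrange(256) for _ in range(10000)]  # flattened 8-bit frame
noisy = [min(255, max(0, p + random.randint(-8, 8))) for p in source]

print(round(psnr(source, noisy), 1))  # grain pushes PSNR down
```

A grainy GPU encode like the GTX 460 shots would score measurably lower against the Blu-ray source than a cleaner one, which is a more defensible comparison than screenshots alone.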
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
The results are not identical, or at least indistinguishable? *shakes head* Fail.

AMD's is the best GPU-centric method, but Intel clearly 'gets it' better. Ironically, though, the QS shot from Dark Knight looks even better than the plain x86 output, showing that in this case the original implementation could use some work (the plain x86 output is far too aliased).

Still, IMO, until the encoders reach a point where the only difference is speed, I'll wait on pure CPU. For any WORM (write once, read many) operation that runs at near-realtime or faster, quality is a more important metric than speed.
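The write-once-read-many argument above can be put in rough numbers (all figures made up for illustration): a slower, higher-quality encode costs extra time exactly once, while a fast encode's quality deficit is paid on every later viewing.

```python
# Lifetime wall-clock cost of one title: encode once, then watch it N times.
def total_hours(encode_hours, viewings, hours_per_viewing=2.0):
    """Total time spent on one title over its lifetime."""
    return encode_hours + viewings * hours_per_viewing

fast_gpu = total_hours(encode_hours=0.5, viewings=10)  # quick but grainy
slow_cpu = total_hours(encode_hours=3.0, viewings=10)  # slow but clean

# The 2.5 h encode-speed gap is small next to 20 h of viewing, but the
# quality difference is visible during all 20 of those hours.
print(fast_gpu, slow_cpu)
```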

This, of course, raises the question: will Quick Sync start showing up in FOSS encoders?
 

Red Storm

Lifer
Oct 2, 2005
14,233
234
106
but Intel clearly 'gets it' better.

Except they kinda don't. In their great wisdom, Intel launched SB with two chipsets: one that enables the SB graphics (and thus Quick Sync) but no OCing, while the other permits overclocking but disables the IGP. So if you want to actually use Quick Sync you have to give up overclocking, and from what I gather from the reviews, you can't have a discrete GPU in your system either.

Intel's logic, I don't get it. o_O
 

SmCaudata

Senior member
Oct 8, 2006
969
1,532
136
Except they kinda don't. In their great wisdom, Intel launched SB with two chipsets: one that enables the SB graphics (and thus Quick Sync) but no OCing, while the other permits overclocking but disables the IGP. So if you want to actually use Quick Sync you have to give up overclocking, and from what I gather from the reviews, you can't have a discrete GPU in your system either.

Intel's logic, I don't get it. o_O

In itself I guess that's okay. The bizarre part to me is that the overclock-enabled chips have the better GPU. Once the Z series launches I guess it will be okay.
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
In itself I guess that's okay. The bizarre part to me is that the overclock-enabled chips have the better GPU. Once the Z series launches I guess it will be okay.

Ya, the better chips that are meant for overclocking, which have the faster GPU, see the GPU side entirely negated if used on a P67 board. It's lousy if you want to use your new Sandy Bridge to its full potential.

Shame there is no Z series at launch. I'd expect those will be expensive boards, given the stance Intel has taken on their mobo lineup.

If I was ever going to use CUDA for transcoding in the future, after reading Anand's article on how lousy a job it does, I think I'll be looking elsewhere.

Leave it to nVidia to overhype and underdeliver.
 
Last edited: