Video playback benchmark - Radeon vs. GeForce vs. QuickSync

Baby octopus

Junior Member
Oct 17, 2012
4
0
0
Hi,

Has anyone done any benchmarking of the video decoding capabilities of GPUs such as Radeon and GeForce versus Intel's QuickSync technology? Is there any such data available, either on AnandTech or elsewhere? Any link will be greatly appreciated :)

Regards,
Baby Octopus
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
x86 > QuickSync > Radeon > GeForce

Transcoding, right?


As far as video playback goes, it doesn't matter - QuickSync isn't a video playback feature, so I just assumed you were talking about transcoding.
 

Baby octopus

Junior Member
Oct 17, 2012
4
0
0
Hi,

I'm not talking about transcoding; I'm talking about video decode capabilities only. I just want to get a feel for how many channels of HD Blu-ray content (3 B-pictures, say 20 Mbps) I would be able to decode on each of these devices. I assume I should be able to do at least 2 channels on these graphics cards, but probably more than 4 using Intel's ASIC-assisted decode on Sandy Bridge/Ivy Bridge platforms (rough test sketch below).

Regards,
Baby Octopus
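
For reference, the rough test I have in mind is something along these lines (a minimal sketch, assuming an ffmpeg build with the relevant hardware decoders on the PATH; check "ffmpeg -hwaccels" for the names your build supports, and note that sample.m2ts, the clip length, and the dxva2 default are placeholders):

Code:
# concurrent_decode_test.py - rough check of how many HW-accelerated decode
# "channels" a box can sustain in real time. Assumes ffmpeg is installed and
# that the chosen -hwaccel name works here; sample.m2ts is a hypothetical
# ~20 Mbps Blu-ray-style clip whose duration is known.
import subprocess
import sys
import time

CLIP = "sample.m2ts"      # placeholder test clip
CLIP_SECONDS = 60.0       # its real-time duration
HWACCEL = sys.argv[1] if len(sys.argv) > 1 else "dxva2"   # e.g. dxva2, vaapi, qsv, cuda

def run_streams(n):
    """Decode n copies of the clip at once, discard the output, return wall time."""
    cmd = ["ffmpeg", "-v", "error", "-hwaccel", HWACCEL,
           "-i", CLIP, "-f", "null", "-"]
    start = time.time()
    procs = [subprocess.Popen(cmd) for _ in range(n)]
    for p in procs:
        p.wait()
    return time.time() - start

for n in range(1, 9):
    wall = run_streams(n)
    ok = wall < CLIP_SECONDS   # all n streams finished faster than real time
    print(f"{n} stream(s): {wall:.1f}s for a {CLIP_SECONDS:.0f}s clip -> "
          f"{'real-time OK' if ok else 'too slow'}")
    if not ok:
        break

If the wall time for n concurrent runs stays under the clip length, the box kept n channels at real time; the first n that doesn't would be the ceiling.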
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
It is not easy to "benchmark" video playback, because there is no single parameter that determines the rankings. If you find such a review, the tester's subjective opinions will inevitably be mingled with the data. Plus, video cards are so powerful these days (even cell phones can output 1080p) that trouble usually stems from drivers, codecs, source material, the player, etc., not the hardware.

There are, of course, basic cards that do not have adequate horsepower for modern material (especially if you have sensitive eyes). Oftentimes you can fall back on the CPU in such situations, unless drivers or players are being stubborn.

AMD and NV are at feature parity for the moment. (Can't speak for QuickSync because I haven't bothered to use it.) Once you go past the laundry list of feature sets, NV cards usually do better handling something newer or less well known (read: torrents, foreign stuff... and porn, maybe?), often at the cost of higher power consumption, since those cases tend to rely on GPU shaders instead of the video processing unit on GeForce. Kepler, however, has shaken things up significantly, in part with a better video processing engine and in part with its all-around modest power profile.

Trying to gauge performance by CPU utilization is difficult because modern CPUs have robust power-saving mechanisms that change various clock domains in milliseconds. Total power consumption might be one way to measure the performance differential (a rough software-side sketch follows this post), but as long as you test with "legal" material, the difference between similar-grade cards will be so minimal that there isn't really a point.

I personally prefer AMD's native color space and its out-of-the-box post-processing, but NV is definitely more versatile as of now, thanks to Kepler. NV had usually been a step behind in this area, or used to do things in brute-force ways (Fermi was the biggest offender). But "now" is what counts anyway, and Kepler's video processing engine gets the nod for its versatility and low power consumption.
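
If you do want a software-side number instead of a wall-socket meter, something like this reads the CPU package energy counter around a decode-only run (a rough sketch: Linux only, assumes the Intel RAPL powercap interface is exposed at the path below, and it only captures CPU/iGPU package power, not a discrete card's own draw; sample.m2ts is a placeholder clip):

Code:
# package_energy.py - reads the CPU package energy counter around a decode run.
# Rough sketch: assumes Linux with the Intel RAPL powercap interface at the
# path below (it may differ per system), and it only sees CPU/iGPU package
# power, not a discrete card's draw. sample.m2ts is a placeholder clip.
import subprocess
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"   # package-0 energy counter (assumed path)

def read_uj():
    with open(RAPL) as f:
        return int(f.read())

# Decode-only run; add the -hwaccel of your choice when comparing setups.
cmd = ["ffmpeg", "-v", "error", "-i", "sample.m2ts", "-f", "null", "-"]

e0, t0 = read_uj(), time.time()
subprocess.run(cmd, check=True)
e1, t1 = read_uj(), time.time()

joules = (e1 - e0) / 1e6   # counter is in microjoules and can wrap on long runs
print(f"CPU package: {joules:.1f} J over {t1 - t0:.1f} s "
      f"(~{joules / (t1 - t0):.1f} W average)")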
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Well, if the video plays smoothly, then that's all there is to measure. If there were major sync issues or something, then we could conclude something. As it is, all of the modern CPUs and GPUs can play back Blu-ray perfectly fine.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Baby octopus said:
I'm not talking about transcoding; I'm talking about video decode capabilities only. I just want to get a feel for how many channels of HD Blu-ray content (3 B-pictures, say 20 Mbps) I would be able to decode on each of these devices. I assume I should be able to do at least 2 channels on these graphics cards, but probably more than 4 using Intel's ASIC-assisted decode on Sandy Bridge/Ivy Bridge platforms.

If I recall right, Intel's can decode 4 streams at once, NVIDIA and AMD 2 - just enough to be BD-Live compliant.