A Question about Heterogeneous Computing?

Phynaz

Lifer
Mar 13, 2006
Yes. It's not about resolution, it's about encoding artifacts.

That said, there is nothing about GPUs that makes them inferior for encoding. It's just a matter of the quality of the encoding software. I don't know if it will get better unless people start paying for it.
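
Just to make that concrete, here's a rough Python sketch of the kind of apples-to-apples test that shows it's the encoder software, not the hardware, that matters. The file names are placeholders, and it assumes an ffmpeg build on your PATH with NVENC support: same input, same resolution, same bitrate, so any quality difference is down to the encoder implementation.

# Sketch: encode the same clip at the same resolution and bitrate with a
# CPU encoder (libx264) and a GPU encoder (h264_nvenc). Resolution is
# identical either way; any difference in artifacts comes from the
# encoder implementation, not from the GPU itself.
# Assumes ffmpeg on PATH, built with NVENC; "input.mp4" is a placeholder.
import subprocess

for name, codec in [("cpu", "libx264"), ("gpu", "h264_nvenc")]:
    subprocess.run(
        ["ffmpeg", "-y", "-i", "input.mp4",
         "-c:v", codec, "-b:v", "4M",  # same target bitrate for both
         "-an",                        # drop audio to isolate video
         f"out_{name}.mp4"],
        check=True,
    )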
 

Idontcare

Elite Member
Oct 10, 1999
In some encoding reviews...

It's just a matter of the quality of the encoding software. I don't know if it will get better unless people start paying for it.

Yep...you get what you pay for. TMPGEnc uses CUDA for transcoding and does just fine, but you pay a pretty penny for the privilege of using that software.

Anything that involves lossy compression, as is the case with encoding/transcoding, is going to have an image quality versus processing speed tradeoff.
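
To put rough numbers on that tradeoff, here's a quick Python sketch (file name is a placeholder; assumes ffmpeg on PATH). At a fixed quality target (CRF), x264's slower presets take longer but produce smaller files; at a fixed bitrate, the same knob buys fewer artifacts instead.

# Sketch of the speed/quality tradeoff: sweep x264 presets at a fixed
# CRF (constant quality) and report encode time and file size. Slower
# presets spend more CPU to compress better.
# Assumes ffmpeg on PATH; "input.mp4" is a placeholder clip.
import os
import subprocess
import time

for preset in ["ultrafast", "medium", "veryslow"]:
    out = f"out_{preset}.mp4"
    start = time.perf_counter()
    subprocess.run(
        ["ffmpeg", "-y", "-i", "input.mp4",
         "-c:v", "libx264", "-preset", preset, "-crf", "20",
         "-an", out],
        check=True,
    )
    elapsed = time.perf_counter() - start
    size_mb = os.path.getsize(out) / 1e6
    print(f"{preset:10s} {elapsed:7.1f}s {size_mb:6.1f} MB")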

What always blew me away is that the early GPU-assisted transcoders were just sooooo damn crappy that they were absolutely useless to any potential customer, and they basically gave the entire industry a bad name.

I can't fathom why they bothered to release known crappy transcoding software given that the only possible outcome was the one that transpired. Intel marketing couldn't have planned it better and they didn't even have to put in a dime to make any of the bad press happen.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
What always blew me away is that the early GPU-assisted transcoders were just sooooo damn crappy that they were absolutely useless to any potential customer, and they basically gave the entire industry a bad name.
You have to go back even farther than that. ATI's AVIVO wasn't even GPU-accelerated at the beginning; it was merely locked to certain GPUs and encoded everything in software at the lowest possible (read: fastest) settings. Of course ATI wouldn't initially admit this...
 

cbn

Lifer
Mar 27, 2009
Yes. It's not about resolution, it's about encoding artifacts.

Thanks, I forgot about artifacts. I don't encode myself and was thinking only about the blurring and softening of edges I have read about in some reviews.

In this AnandTech review, "artifacts", "color distortion", and "softening of edges" are all mentioned. (Note: with Nvidia hardware, the tested images were deemed "clean".)
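
For what it's worth, you don't have to eyeball this stuff; a full-reference metric like SSIM puts a number on artifacts and edge softening. A minimal Python sketch (file names are placeholders; assumes ffmpeg on PATH):

# Sketch: quantify encoding artifacts by comparing an encoded clip
# against its source with ffmpeg's ssim filter. A score of 1.0 means
# identical; blur, blocking, and color distortion all pull it down.
# Assumes ffmpeg on PATH; file names are placeholders.
import subprocess

result = subprocess.run(
    ["ffmpeg", "-i", "encoded.mp4", "-i", "source.mp4",  # main, then reference
     "-lavfi", "ssim", "-f", "null", "-"],
    capture_output=True, text=True,
)
# ffmpeg prints the SSIM summary on stderr, e.g. "SSIM ... All:0.97 ..."
for line in result.stderr.splitlines():
    if "SSIM" in line:
        print(line)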