
A Question about Heterogeneous Computing?

Yes. It's not about resolution, it's about encoding artifacts.

That said, there is nothing about GPUs that makes them inferior for encoding. It's just a matter of the quality of the encoding software. I don't know if it will get better unless people start paying for it.
 

Yep...you get what you pay for. TMPGEnc uses CUDA for transcoding and does just fine, but you pay a pretty penny for the privilege of using that software.

Anything that involves lossy compression, as is the case with encoding/transcoding, is going to have an image quality versus processing speed tradeoff.
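That quality-versus-size tradeoff can be sketched with a toy example. This is a hypothetical illustration, not any real encoder: coarser quantization (the core lossy step in video codecs) shrinks the number of distinct values to store but raises reconstruction error.

```python
# Toy lossy "encoder": quantize a signal at different step sizes and
# compare how many distinct levels survive (a proxy for bits needed)
# against the mean squared error of the reconstruction.
import math

def quantize(samples, step):
    """Snap each sample to the nearest multiple of `step`."""
    return [round(s / step) * step for s in samples]

def mse(a, b):
    """Mean squared error between two equal-length sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

signal = [math.sin(i / 10) for i in range(1000)]

for step in (0.01, 0.1, 0.5):
    q = quantize(signal, step)
    levels = len(set(q))  # fewer levels -> cheaper to store
    print(f"step={step}: levels={levels}, mse={mse(signal, q):.6f}")
```

Running it shows the monotonic tradeoff: larger quantization steps yield fewer levels (smaller output) at the cost of higher error, which is exactly the knob encoders trade against speed and bitrate.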

What always blew me away is that the early gpu-assisted transcoders were just sooooo damn crappy, they were absolutely useless to any potential customer and basically gave the entire industry a bad name.

I can't fathom why they bothered to release known crappy transcoding software given that the only possible outcome was the one that transpired. Intel marketing couldn't have planned it better and they didn't even have to put in a dime to make any of the bad press happen.
 
You have to go back even farther than that. ATI's AVIVO wasn't even GPU accelerated at the beginning, it was merely locked to certain GPUs and encoded everything in software at the lowest possible (read: fastest) settings. Of course ATI wouldn't initially admit this...
 

Thanks, I forgot about artifacts. I don't encode myself and was thinking only about the blurring and softening of edges I've read about in some reviews.

In this AnandTech review, "artifacts," "color distortion," and "softening of edges" are all mentioned. (Note: with Nvidia hardware, the tested images were deemed "clean.")
 