Discussion: The Death of the Desktop CPU


AnitaPeterson

Diamond Member
Apr 24, 2001
Even with the same codec, resolution, and bitrate, hardware encoding has worse visual quality than software encoding.
Yes and no. You're right: at a fixed, lower bitrate, hardware encoders produce lower quality than software ones.

That being said, if you raise the bitrate and accept larger files, the hardware encoders - particularly on the new generation of cards! - have improved their quality by leaps and bounds.
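To put a rough number on the size tradeoff, file size scales linearly with bitrate. A minimal sketch (the clip length and bitrates below are hypothetical, chosen only to illustrate giving a hardware encoder extra bitrate headroom):

```python
def encoded_size_mib(bitrate_kbps: float, duration_s: float) -> float:
    """Approximate size of an encoded video stream: bitrate times duration.
    Ignores container overhead and the audio track."""
    return bitrate_kbps * 1000 / 8 * duration_s / (1024 ** 2)

# Hypothetical 10-minute 1080p clip: a software encode at 8 Mbps
# versus a hardware encode bumped to 12 Mbps to chase similar quality.
sw = encoded_size_mib(8000, 600)   # ~572 MiB
hw = encoded_size_mib(12000, 600)  # ~858 MiB
print(round(sw), round(hw))
```

So a 50% bitrate bump costs you 50% more disk, which is often an acceptable trade for the speed of a hardware encoder.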

Doug S

Diamond Member
Feb 8, 2020
It's not a matter of codec support. Even with the same codec, resolution, and bitrate, hardware encoding has worse visual quality than software encoding.

Well sure: hardware encoders are designed to do the best they can within a limited power budget. Software encoders have access to all the CPU cores (and GPU cores, depending on how they are written), so their power budget is limited only by the overall SoC power budget. They can do roughly an order of magnitude more computation per unit time, so of course they will produce better results at a given bitrate.