Is a 4k bitrate divx'ed down to 100MB the same as a 5k bitrate divx'ed down to 100MB?

BrunoPuntzJones
So I'm recording some TV shows and want to archive them. I haven't been able to tell a difference, but I'm curious whether it matters. Is recording at 4k and making a 100MB file the same as recording at 5k and making a 100MB file? It doesn't seem like it would be. It seems like the 5k one would have more compression artifacts/pixelation/stuttering.

Seems like it would be better to record at a lower bitrate and then encode, rather than at a higher one, if you want to end up with the same file size.

No?
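For what it's worth, the target file size pins down the average bitrate all by itself, no matter what the capture bitrate was. A quick back-of-the-envelope sketch in Python (the 22-minute runtime and 128kbps audio are my assumptions, not exact numbers):

# Rough math for hitting a fixed file size. The runtime and audio
# bitrate below are assumed for illustration.
FILE_SIZE_MB = 100
RUNTIME_SEC = 22 * 60     # assume a ~22-minute TV episode
AUDIO_KBPS = 128          # assume 128kbps audio

total_kbits = FILE_SIZE_MB * 8 * 1024           # file size in kilobits
audio_kbits = AUDIO_KBPS * RUNTIME_SEC          # kilobits spent on audio
video_kbps = (total_kbits - audio_kbits) / RUNTIME_SEC

print(f"average video bitrate for a {FILE_SIZE_MB}MB file: {video_kbps:.0f} kbps")
# -> roughly 493 kbps, whether the capture was done at 4k or at 5k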

Matthias99

Originally posted by: BrunoPuntzJones
Is recording at 4k and making a 100MB file the same as recording at 5k and making a 100MB file?

No. The lower the input quality, the lower the output quality (down to a point; if the source is poor enough, the encoder can reproduce it essentially losslessly at your target size). The better the original recording, the better the encoder's output will likely be -- it has more information to work with. Of course, past a certain point the extra source quality is also wasted, as your encoder will just throw most of that information away (since you're asking for very aggressive compression).
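A toy way to see both effects (my own sketch, treating each lossy pass as plain uniform quantization in numpy -- nothing like a real codec, just the information-loss idea):

import numpy as np

rng = np.random.default_rng(0)
signal = rng.normal(size=100_000)        # stand-in for the raw video data

def lossy(x, step):
    # Crude stand-in for lossy encoding: uniform quantization.
    # A bigger step means a lower bitrate and more information discarded.
    return np.round(x / step) * step

FINAL_STEP = 0.5                         # the aggressive "100MB" pass, same for all

for capture_step in (0.05, 0.30, 0.60):  # best to worst capture quality
    captured = lossy(signal, capture_step)
    archived = lossy(captured, FINAL_STEP)
    rms = np.sqrt(np.mean((archived - signal) ** 2))
    print(f"capture step {capture_step:.2f} -> final RMS error {rms:.3f}")

# The final error grows as capture quality drops (~0.14 -> ~0.17 -> ~0.22 on
# my toy numbers), but the best capture is barely better than quantizing the
# raw signal at 0.5 directly (~0.14): past a point, the final pass throws
# the extra information away anyway.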