It really depends. Snow White looked awesome on Blu-ray which is hand drawn animation
I don't know why they wouldn't. However, Bluray videos suck anyway because they're lossy.
Compressed video depends highly on who did the encoding. Some studios, like with Snow White, will have people examining every frame that comes out of the encoder and tweaking the results, while others will simply stream the content to the encoder and press it to disc without worrying about the results.
Yep. The artifacts bug the hell out of me, but I guess when HDTVs are so shitty it doesn't bother most people. Or since they buy into the 1080p marketing gimmick they don't care about quality. I'd much rather have 800x600p lossless than 1920x1080p lossy. I realize that the detail would be far less, but what good does detail do if you have a lot of compression artifacts?

So... every video sucks?
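For context on that 800x600-lossless-versus-1080p-lossy tradeoff, here's a back-of-the-envelope sketch of uncompressed data rates. The 24-bit/24 fps figures are illustrative assumptions, and `raw_rate_mbps` is just a throwaway helper, not anything from a spec:

```python
# Back-of-the-envelope data rates for uncompressed 24-bit video at 24 fps.
# All figures are illustrative assumptions, not from any format spec.

def raw_rate_mbps(width, height, bits_per_pixel=24, fps=24):
    """Uncompressed video data rate in megabits per second."""
    return width * height * bits_per_pixel * fps / 1_000_000

print(f"800x600 lossless:   {raw_rate_mbps(800, 600):.0f} Mbps")    # 276
print(f"1920x1080 lossless: {raw_rate_mbps(1920, 1080):.0f} Mbps")  # 1194
```

Blu-ray video tops out around 40 Mbps, so even 800x600 truly uncompressed (~276 Mbps) would be far beyond what the format can stream; some form of compression (lossless or lossy) would still be needed.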
It doesn't take much of a display to see all of the compression artifacts. The detail would be amazing, but only if it was without compression artifacts. Further, I've observed a large difference when I watch a video recorded with FRAPS without RGB lossless checked vs. with it checked. It takes a lot more space, but it's worth every bit (pun intended).

So you've done a comparison between the raw uncompressed footage and the Blu-ray version? What display were you viewing this on that you could tell the difference?
It absolutely does. I've bought 1 CD in the short time since the DoJ started cracking down. It's like only the 6th audio CD I've bought in my whole life.

Somehow I bet that doesn't keep you from pirating the shit out of them.
Imagine if GPU IHVs lossily compressed RGBA backbuffer precision (or even if they started using RGB7 front buffers): the artifacts would be noticeable left and right, especially at higher resolutions, and if they used lossy depth-buffer compression, long depth ranges wouldn't look as good. Even with the best color dithering techniques, like the ones 3dfx used, most people would be able to observe the difference. We can already see how bad lossy textures look (some 4:1 S3TC textures look downright awful). Textures would be a lot larger if they were lossless, but there would be no compression artifacts.
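For reference, S3TC is fixed-rate block compression, so its ratios come straight from the block sizes: DXT1 packs a 4x4 texel block into 64 bits, while DXT3/DXT5 use 128 bits. A quick sketch of the arithmetic (the `ratio` helper is just for illustration):

```python
# S3TC compression ratios from block sizes (fixed-rate block compression).
# DXT1: 4x4 texels in 64 bits; DXT3/DXT5: 4x4 texels in 128 bits.

BLOCK_TEXELS = 4 * 4

def ratio(raw_bits_per_texel, compressed_block_bits):
    """Ratio of raw block size to compressed block size."""
    raw_bits = BLOCK_TEXELS * raw_bits_per_texel
    return raw_bits / compressed_block_bits

print(f"DXT1 vs RGB8:  {ratio(24, 64):.0f}:1")   # 6:1
print(f"DXT1 vs RGBA8: {ratio(32, 64):.0f}:1")   # 8:1
print(f"DXT5 vs RGBA8: {ratio(32, 128):.0f}:1")  # 4:1 -- the "4:1 S3TC" case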
If there is audio I'm not interested in, then I don't really care if it's an MP3, and I probably wouldn't notice the difference. However, lossy video has outlived its usefulness: we don't need all of the detail of 1080p, and 720p lossless video would provide sufficient detail for me.
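Whether 720p lossless is practical comes down to storage. A rough sketch, assuming 24-bit RGB at 24 fps and a ballpark 2:1 lossless compression ratio (both illustrative assumptions, not measured figures; `movie_gb` is a made-up helper):

```python
# Rough storage estimate for a lossless 720p movie.
# Assumes 24-bit RGB, 24 fps, and a ~2:1 lossless compression ratio --
# all illustrative assumptions, not measured figures.

def movie_gb(width, height, fps, hours, bpp=24, lossless_ratio=2.0):
    """Estimated size in gigabytes for a losslessly compressed movie."""
    raw_bits = width * height * bpp * fps * hours * 3600
    return raw_bits / lossless_ratio / 8 / 1_000_000_000

print(f"~{movie_gb(1280, 720, 24, 2):.0f} GB")  # ~239 GB
```

Even under those generous assumptions, a two-hour 720p lossless movie lands around 239 GB, against 50 GB on a dual-layer Blu-ray, which is why the format leans on lossy codecs.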