Originally posted by: Patrick Wolf
Ok, I'm not sure what bitrate is. Going from 1500kbps to 800kbps it's really hard to tell the difference in quality, but I think I do see an extremely small difference.
Bitrate is the number of bits of information (ones and zeroes) in the video file for each unit of time (generally per second). In a '1500kbps' file, for instance, each second of video takes 1,500 * 1,000 = 1,500,000 bits, which works out to roughly 190KB.
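That arithmetic can be sketched in a couple of lines (a minimal illustration, assuming the usual video convention of 1 kbps = 1,000 bits per second and 1 KB = 1,000 bytes; the function name is just for this example):

```python
# Back-of-the-envelope: how many bytes of video data one second holds
# at a given bitrate. 8 bits per byte.
def bytes_per_second(kbps):
    return kbps * 1000 / 8

print(bytes_per_second(1500))  # 187500.0 bytes, i.e. roughly 190KB/sec
```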
1500kbps is pretty low to begin with. You're already looking at a highly compressed video (unless the resolution is extremely low). It also depends on what format you are using; MPEG2 or WMV7 degrade much less gracefully at low bitrates than DivX/Xvid or WMV9. Constant-bitrate (CBR) encoding will usually show more degradation than variable-bitrate (VBR), which can spend extra bits on the scenes that need them.
If you are capturing video from a camcorder, etc., try setting it to MPEG2 format, 720x480 (standard DVD resolution), and the bitrate at ~8-9Mbps (8000-9000kbps). That's about what a good DVD looks like. Then try it at ~4Mbps, then ~1-2Mbps. You should easily see the difference. Of course, if your source material is really bad (or compresses very well, like footage with very little motion), the differences may not be as pronounced.
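To get a feel for the trade-off, here's a quick sketch of how big a one-minute clip comes out at those bitrates (same assumptions as before: 1 kbps = 1,000 bits/sec, 1 MB = 1,000,000 bytes; the helper name is made up for this example, and real file sizes will differ a bit because of audio tracks and container overhead):

```python
# Approximate encoded file size in MB for a clip of a given length.
def clip_size_mb(kbps, seconds):
    return kbps * 1000 * seconds / 8 / 1_000_000

for rate in (8500, 4000, 1500):
    print(f"{rate} kbps for 60 sec -> {clip_size_mb(rate, 60):.2f} MB")
```

So a minute of DVD-quality MPEG2 at ~8.5Mbps is around 64MB, versus about 11MB at 1500kbps; that factor of ~6 in data is where the quality goes.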