You don't seem to understand very well how compression works
LOL, I'm a programmer; I've implemented common compression algorithms. Do you even know what a Huffman tree or a DCT quantization matrix is?
The only point I was making is that for anyone to have that large a quantity of data, it can already be assumed that the majority of it is media files. Nothing else requires that kind of space. For the majority of people, anything over a couple hundred MB can be assumed to be media files, which don't compress well, or at all.
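The point is easy to demonstrate with Python's zlib (standing in here for any general-purpose lossless compressor): redundant data shrinks dramatically, while high-entropy bytes, which is statistically what the payload of a JPEG or MP3 looks like, barely shrink at all. A quick sketch:

```python
import os
import zlib

# Highly redundant text compresses very well.
text = b"the quick brown fox jumps over the lazy dog " * 1000

# High-entropy bytes (statistically similar to the payload of an
# already-compressed media file) don't compress; the output can even
# be slightly larger than the input due to format overhead.
noise = os.urandom(len(text))

for label, data in (("text", text), ("noise", noise)):
    out = zlib.compress(data, 9)
    print(f"{label}: {len(data)} -> {len(out)} bytes "
          f"({100 * len(out) / len(data):.1f}%)")
```

Running it, the text line lands in the low single-digit percentages while the noise line sits at roughly 100%, which is exactly why zipping a folder of media files buys you nothing.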
Transcoding those files to lower bit rates is not compression; it's transcoding, and it throws information away. I could get a 2 MB Word document onto a 1.44 MB floppy by deleting every 4th line, too.
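The floppy analogy in code form, a toy sketch: dropping every 4th line makes the document smaller, but nothing can bring those lines back, which is what separates lossy transcoding from lossless compression.

```python
# "Shrink" a document by deleting every 4th line. The result is smaller,
# but unlike a real (lossless) zip, the dropped lines are unrecoverable.
lines = [f"line {i}" for i in range(12)]
kept = [ln for i, ln in enumerate(lines) if i % 4 != 3]
print(len(lines), "->", len(kept), "lines; the rest are gone for good")
```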
Ironically, .docx, .xlsx, etc. files are ZIP containers, so you could make the case that even simple documents can't be compressed anymore because they already are. Likewise, PDF uses compression or already contains lots of JPEG-encoded image data; program installation packages are decently compressed; partition backups have varying levels of compression; and MP3, JPG, DivX, WMV, AVI, etc. all go without saying.
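You can verify the container claim yourself. Below is a minimal stand-in built in memory (a real .docx also carries [Content_Types].xml, relationship files, and so on, which are omitted here): the file starts with the ZIP magic bytes `PK\x03\x04`, and the standard zipfile module recognizes it.

```python
import io
import zipfile

# Build a minimal stand-in for a .docx: it is literally a ZIP archive
# whose entries happen to be XML parts (real files have more parts).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("word/document.xml", "<w:document/>")

data = buf.getvalue()
print(data[:4])                               # the ZIP magic bytes
print(zipfile.is_zipfile(io.BytesIO(data)))   # recognized as a ZIP
```

The same check works on an actual .docx from disk: rename it to .zip and any archiver opens it, because the DEFLATE pass has already been done.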
Half the time I zip/rar anything anymore, it's to consolidate thousands of small files into a single large file for storage device performance. I don't need to RAR my partition backups when I've already selected the best (slowest) compression method in Acronis, to give just one example.
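That consolidation use case doesn't even need compression turned on. A sketch with Python's zipfile, using ZIP_STORED so the archive is a pure container: one large file on disk instead of thousands of tiny filesystem objects.

```python
import io
import zipfile

# Pack many small files into one archive with NO compression (ZIP_STORED).
# The goal is fewer filesystem objects and sequential reads, not size.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_STORED) as zf:
    for i in range(1000):
        zf.writestr(f"small/{i:04}.dat", b"x" * 64)

archive = zipfile.ZipFile(io.BytesIO(buf.getvalue()))
print(len(archive.namelist()), "files in one container")
```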