Creating Audio CD File Size vs. Minutes

MustISO

Lifer
Oct 9, 1999
I want to burn an audio file that is several hours long but only 100MB to an audio CD. I'm confused by the relationship between file size and running time when burning an audio CD.

A CD can hold, say, 80 minutes of audio or 700MB of data. I can easily fit hours and hours of MP3s on a CD, but if I create an audio CD, why is it limited to 80 minutes regardless of file size or bit rate?

Even if I reduce the bitrate of the file, the burning program still shows the same running time, which I guess makes sense.
 

PhoenixEnigma

Senior member
Aug 6, 2011
CD audio is fixed to a particular uncompressed format that was defined by the Red Book standard in 1980. It's 16-bit stereo PCM at 44.1 kHz, and that's what it's going to be if you want it to play as a regular audio CD. Changing the bit rate of the source file has no impact on that.

You can, of course, throw the MP3 file on the CD as a data file, and a good portion of CD players from the past decade or so will probably be able to make sense of that and decode the MP3 file, but it's not really an audio CD then, and isn't nearly as widely playable.
 

alcoholbob

Diamond Member
May 24, 2005
If you are going to convert whatever file you have to WAV, it's going to be limited to 80 minutes because the format has a fixed data rate.

16 bits × 44.1 kHz × 2 channels = 1,411.2 kbps
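As a sanity check, here's that arithmetic in a few lines of plain Python (nothing beyond the numbers already in the thread; the 80-minute figure is the nominal Red Book limit):

```python
# Red Book CD audio: fixed, uncompressed PCM parameters.
bit_depth = 16          # bits per sample
sample_rate = 44_100    # samples per second, per channel
channels = 2            # stereo

bitrate_kbps = bit_depth * sample_rate * channels / 1000
print(bitrate_kbps)     # 1411.2

# Raw audio payload for a full 80-minute disc.
seconds = 80 * 60
total_bytes = bit_depth * sample_rate * channels // 8 * seconds
print(total_bytes / 1_000_000)  # 846.72 (MB of raw audio)
```

Note that ~847MB of raw audio exceeds the ~700MB data capacity: audio mode dedicates less of the disc to error correction than data mode does, so the same disc holds more raw audio bytes than data bytes.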
 

slashbinslashbash

Golden Member
Feb 29, 2004
To perhaps expand a bit on what PhoenixEnigma said:

MP3s are compressed audio. They can be compressed at different rates and qualities. Take a song that is 5 minutes long. It could take up about 2.4MB as a 64kbps MP3, or 12MB at 320kbps. When you play it back, the song will be the same length, despite one file being 5 times the size of the other. You will almost certainly be able to hear the difference between the two MP3s; the smaller one (64kbps) will sound worse than the larger one (320kbps).
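For a constant-bitrate MP3, the file size is just bitrate times duration; here's a quick sketch of that math (the helper name `mp3_size_mb` is made up for illustration):

```python
# Approximate size of a constant-bitrate MP3 of a given length.
# kbps = kilobits (not kilobytes) per second, as bitrates are quoted.
def mp3_size_mb(kbps, seconds):
    return kbps * 1000 * seconds / 8 / 1_000_000

duration_s = 5 * 60  # a 5-minute song

print(mp3_size_mb(64, duration_s))   # 2.4  (MB)
print(mp3_size_mb(320, duration_s))  # 12.0 (MB) -- exactly 5x the 64kbps file
```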

A CD does not use compression at all. It is a straight translation of the analog audio signal into digital form, at a fixed sampling rate and bit depth. Any 5:00 song on a CD will take up exactly the same amount of space as any other 5:00 song on a CD. CD audio is sampled at 44.1kHz with 16-bit samples, meaning that one second of audio consists of 44,100 numbers per channel, each of which is 16 bits in size (i.e., it has 65,536 possible values). With two stereo channels, that's 88,200 numbers per second, 176,400 numbers for 2 seconds, etc.
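To put a number on that, here's the uncompressed size of any 5:00 song at CD quality; it depends only on the length, never on what the music sounds like:

```python
# Uncompressed CD-quality audio: size is determined purely by duration.
sample_rate = 44_100   # samples per second, per channel
bit_depth = 16         # bits per sample
channels = 2           # stereo
duration_s = 5 * 60    # any 5-minute song

samples_per_channel = sample_rate * duration_s
total_bytes = samples_per_channel * channels * bit_depth // 8

print(samples_per_channel)      # 13230000 samples per channel
print(total_bytes / 1_000_000)  # 52.92 (MB, regardless of content)
```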

This is unchangeable and built into the CD format, which was developed in the late 1970s and released in 1980. Remember that audio compression/decompression is a very processing-intensive task (while the simple digital-to-analog conversion for CD audio is a comparative walk in the park). Even an 80386, a common CPU in the early 1990s, could not decode a standard 128kbps MP3 in real time; as far as I know, the earliest computers able to play MP3s reliably had Pentium CPUs with MMX. Of course, there are now purpose-built chips with hardware decoders that do the job admirably, but silicon was not so cheap back then.