If so, then what are the most common DACs used? Does the DAC quality in digital monitors not really matter that much? What type of DAC does the Apple LED Cinema Display (24-inch) use?
Also, are the terms digital-to-analog converter and video encoder synonymous?
The type of DAC varies in digital monitors. Cheaper monitors use cheaper DACs.
For example, low-end LCDs use 6-bit DACs. These can produce 64 different brightness levels per color channel, giving a total of 64³ = 262,144 possible colors. Your standard budget LCD monitor is like this (although these monitors use tricks such as dithering that can help make the images look better).
High-end LCDs use 8-bit DACs, allowing the monitor to display 256³ ≈ 16.7 million colors. (The Apple Cinema Display uses 8-bit DACs.)
Pro-level LCDs use 10-bit (or higher) DACs, allowing billions of colors (or very precise adjustment of a smaller number of colors). Most graphics cards and operating systems only support 8-bit color, so there is very little benefit in using a 10-bit monitor; only high-end graphics design, medical imaging systems, etc. need 10-bit.
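If you want to see where those color counts come from, here's a quick back-of-the-envelope calculation (Python, but the arithmetic is the point): each channel gets 2^bits levels, and the three channels combine multiplicatively.

```python
# An N-bit DAC gives 2**N brightness levels per channel, and the three
# channels (R, G, B) combine multiplicatively into the total palette.
for bits in (6, 8, 10):
    levels = 2 ** bits       # brightness steps per color channel
    colors = levels ** 3     # all possible R/G/B combinations
    print(f"{bits}-bit DAC: {levels} levels per channel, {colors:,} colors")
```

That prints 262,144, roughly 16.7 million, and roughly 1.07 billion, i.e. the figures quoted above.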
A digital-to-analog converter and a video encoder are totally different things.
A DAC takes a digital signal (e.g. a number stored in RAM, which contains the brightness for a pixel) and converts it to an analog voltage (which is sent to the actual pixel on the screen).
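As a mental model, an ideal N-bit DAC just maps the code linearly onto a reference voltage. This is an idealized sketch, not any particular monitor's circuitry; the function name and the v_ref parameter are made up for illustration:

```python
def dac_output(code: int, bits: int = 8, v_ref: float = 1.0) -> float:
    """Idealized N-bit DAC: map a digital code (0 .. 2**bits - 1) linearly
    onto a voltage between 0 and v_ref. Real DACs add nonlinearity and
    noise on top of this."""
    max_code = 2 ** bits - 1
    if not 0 <= code <= max_code:
        raise ValueError(f"code must be in 0..{max_code}")
    return v_ref * code / max_code

print(dac_output(0))    # 0.0    -- black pixel
print(dac_output(255))  # 1.0    -- full brightness
print(dac_output(128))  # ~0.502 -- mid grey
```

With bits=6 the same output range collapses to just 64 distinct voltages, which is why budget 6-bit panels can show visible banding without dithering.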
A video encoder takes a digital signal corresponding to many pixels. It then analyses the signal to work out what humans can and can't see well, and throws away the data it thinks humans won't notice. The result is a stripped-down digital signal that is smaller and easier to store or transmit.
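To make "throws away the data it thinks humans won't notice" concrete, here is a minimal sketch of one classic trick, chroma subsampling: eyes resolve brightness detail better than color detail, so encoders often keep brightness (luma) at full resolution but store color (chroma) at half resolution. The function and the toy data below are mine, not taken from any real codec:

```python
def subsample_chroma(chroma: list[list[int]]) -> list[list[int]]:
    """Halve a chroma plane in both dimensions by averaging 2x2 blocks
    (the idea behind 4:2:0 subsampling). Assumes even width and height;
    real encoders handle edge cases and do far more than this."""
    h, w = len(chroma), len(chroma[0])
    return [
        [
            (chroma[y][x] + chroma[y][x + 1]
             + chroma[y + 1][x] + chroma[y + 1][x + 1]) // 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

plane = [[16, 18, 200, 202],
         [17, 19, 201, 203]]
print(subsample_chroma(plane))  # [[17, 201]] -- a quarter of the original data
```

Real codecs layer many more steps on top (prediction, transforms, quantisation), but the principle is the same: discard what viewers are least likely to notice.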