Example: the 1-bit DAC on CD players.
How can you do anything without eight bits?
A 1-bit DAC is just a marketing name for a delta-sigma modulator. Wikipedia gives a good, if somewhat hard-going, description of how this works.
The problem with DACs (and ADCs) is that you need to filter the signal in the analog domain to remove frequencies above half the sampling rate (the Nyquist limit). (In an ADC, such high frequencies will confuse the converter, which renders them as spurious low frequencies - this is aliasing. With a DAC, high-frequency images will be added to the signal artefactually, because a DAC creates a "stepped" waveform.)
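To see the ADC half of that concretely, here's a minimal numeric sketch (Python; the 40 kHz sample rate is purely illustrative): a 30 kHz tone sampled at 40 kHz produces exactly the same samples as a 10 kHz tone, so without an analog filter in front, the converter cannot tell them apart.

```python
# Aliasing sketch: a tone above the Nyquist limit (20 kHz at a 40 kHz
# sample rate) is indistinguishable, sample for sample, from its alias.
import numpy as np

fs = 40_000                      # sample rate (Hz), illustrative
n = np.arange(16)                # a few sample instants
t = n / fs

tone_30k = np.cos(2 * np.pi * 30_000 * t)   # above Nyquist
tone_10k = np.cos(2 * np.pi * 10_000 * t)   # its alias below Nyquist

# The two sampled sequences are identical to floating-point precision:
print(np.allclose(tone_30k, tone_10k))      # True
```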
Good-quality analog filters are difficult and expensive to build, need careful calibration, and can drift out of calibration with age, temperature, etc. Any drift in performance can corrupt your desired signal or let artefacts of the conversion through.
If you can build a digital interpolator that expands a 40 kHz digital signal to, e.g., 10 MHz, then you can use a digital algorithm in your interpolator that does not produce any spurious signals in the 20 kHz - 5 MHz range. You can then use a very basic analog filter to remove everything above 5 MHz - as your filter only needs to operate miles away from your desired signal, even the crappiest analog filter will offer outstanding performance on this very easy job.
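Here's a minimal sketch of such an interpolator, assuming a toy 8x ratio rather than the ~250x in the text, and using scipy's firwin for the low-pass design. Zero-stuffing raises the sample rate but creates spectral images of the signal; the digital filter removes them, and its performance is a matter of pure arithmetic rather than analog component tolerances.

```python
# Digital interpolation (oversampling) sketch: 40 kHz -> 320 kHz.
import numpy as np
from scipy.signal import firwin, lfilter

fs = 40_000                          # original sample rate
ratio = 8                            # oversampling ratio (toy value)
t = np.arange(256) / fs
x = np.sin(2 * np.pi * 1_000 * t)    # 1 kHz test tone

# Zero-stuff: insert ratio-1 zeros between samples. This raises the rate
# to 320 kHz but creates images of the signal around multiples of 40 kHz.
y = np.zeros(len(x) * ratio)
y[::ratio] = x

# A digital low-pass at the original Nyquist (20 kHz) removes the images.
taps = firwin(numtaps=127, cutoff=20_000, fs=fs * ratio)
y_interp = ratio * lfilter(taps, 1.0, y)   # gain of `ratio` restores amplitude
```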
Building a 10 MHz DAC is harder than building a 40 kHz one. But that doesn't matter so much - you can use a much lower-resolution DAC and use "dithering" (rather like FRC on LCD monitors) to simulate a higher-resolution DAC, and let your analog filter clean up the signal (which even a crappy filter will do beautifully, because the noise you are intentionally adding to the signal is miles away from your desired signal).
In short, if your interpolator can expand a 40 kHz signal to a 10 MHz signal, then by dithering you can reduce the resolution from 16 bits to 1 bit (i.e. on/off) and still retain full quality.
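A minimal sketch of the core trick, a first-order delta-sigma modulator, assuming the signal has already been interpolated to a high rate (320 kHz here, a toy value rather than 10 MHz). The output is a pure 1-bit (+1/-1) stream whose average tracks the input; the quantization error is fed back so it cancels out over time.

```python
# First-order delta-sigma modulator: 1-bit output, full-quality average.
import numpy as np

def delta_sigma_1bit(x):
    """Convert samples in [-1, 1] to a +/-1 bitstream (first order)."""
    out = np.empty(len(x))
    integrator = 0.0
    for i, sample in enumerate(x):
        integrator += sample - (out[i - 1] if i else 0.0)  # error feedback
        out[i] = 1.0 if integrator >= 0.0 else -1.0        # 1-bit quantizer
    return out

# Low-pass filtering the bitstream (a crude moving average stands in for
# the analog filter here) recovers the original waveform.
fs = 320_000
t = np.arange(4_000) / fs
x = 0.5 * np.sin(2 * np.pi * 1_000 * t)
bits = delta_sigma_1bit(x)
recovered = np.convolve(bits, np.ones(64) / 64, mode="same")
print(np.max(np.abs(recovered[500:-500] - x[500:-500])))   # small residual error
```

The residual error printed at the end comes mostly from the crude moving-average filter, not from the 1-bit stream itself - exactly the author's point that even a poor filter suffices when the noise sits far above the signal.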
This 1-bit DAC design is cheaper and simpler to construct than a "real" 16-bit DAC. A "real" 16-bit DAC often needed precision calibration (on-die resistors would be laser-trimmed at the factory), leading to massive prices. I remember paying over $100 for a 192 kHz 16-bit DAC chip for a scientific experiment back in the mid-'90s. Even with precision calibration, these chips often suffered from defects (e.g. a non-linear response). By contrast, a 1-bit DAC needs no calibration - it's just a switch - and because of the digital signal processing, it is perfectly linear in its response.
In fact, things go further than that. The quantization noise produced by a "real" DAC is spread equally across all frequencies (it's white noise). If you tweak the digital side slightly (the feedback in the delta-sigma loop), you can bias the noise into the high-frequency range (where it will be filtered out), giving even higher fidelity than a "real" DAC could have produced.
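A back-of-envelope check of that shaping, for a first-order loop like the sketch above (the frequencies are illustrative): the quantization noise in such a loop is multiplied by the noise transfer function NTF(z) = 1 - z^-1, whose magnitude is tiny at low frequencies and largest near half the sample rate.

```python
# Noise gain |1 - e^{-j 2 pi f / fs}| of a first-order delta-sigma loop:
# near zero in the audio band, ~2x near fs/2 (where the filter kills it).
import numpy as np

fs = 320_000                            # assumed oversampled rate
for f in (1_000, 10_000, 20_000, 100_000, 160_000):
    gain = abs(1 - np.exp(-2j * np.pi * f / fs))
    print(f"{f:>7} Hz: noise gain {gain:.3f}")   # 0.020 at 1 kHz, 2.000 at 160 kHz
```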
The benefits of the delta-sigma design of DAC/ADC are so enormous that, with the exception of ultra-fast converters (e.g. RAMDACs, monitor ADCs, LCD panel DACs, and cell phone/base station radio ADCs/DACs), virtually all converters on the market are now delta-sigma.
I'm told that modern top-end audio DACs have moved away from the "1-bit" construction; they tend to use a 2-bit or 4-bit quantizer as the core of their delta-sigma systems (apparently this can "shape" the noise even better, giving essentially no detectable noise within the audio band).