I think those answers are slightly off. It depends on exactly which adapters are used and which signal modes the hardware can accept. All three scenarios could end up analog at both ends, or digital (DVI) at both ends.
There are two possibilities for each case, depending on whether the DVI port on the monitor can accept an analog signal (which I don't think most of them can).
#1: If the monitor's DVI port can accept analog input, then you'd have analog VGA signalling at both ends. The DVI port on a video card is usually DVI-I, so it sends the analog signal out on the appropriate pins. The adapter simply connects those pins to the corresponding pins on a VGA plug; it's the same actual signal as if the card had a VGA port. The cable carries it to a second adapter, which reverses the pin conversion to feed the signal to the analog pins on the monitor's port. You lose a bit of signal quality from the extra connections, but that's only an issue at high resolutions and refresh rates.
If the monitor's DVI port CANNOT accept an analog input, then you need the more expensive active converter at the monitor end to convert the analog signal to digital. In that case it's analog VGA at the video card and digital DVI at the monitor.
If you really wanted to, you could take digital output from the video card, run it through an expensive converter onto the VGA cable, and then through another expensive converter at the other end to make it digital again. All those conversions would cause a lot of image quality loss, though.
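The passive-versus-active choice above can be sketched as a small decision rule. This is a hypothetical illustration (the function name and return strings are mine, not any real API): DVI-I and DVI-A ports carry analog on dedicated pins, so a cheap pin-rewiring adapter works; DVI-D has no analog pins, so an active converter is required.

```python
# Hypothetical sketch: which kind of adapter a DVI port needs to talk to
# a VGA (analog) cable. Names are illustrative, not from any real API.

def adapter_needed(dvi_port_type):
    """Return the adapter required between a VGA cable and a DVI port.

    DVI-I and DVI-A ports carry the analog signal on dedicated pins, so a
    cheap passive adapter just rewires those pins to the VGA connector.
    DVI-D ports have no analog pins, so an active converter must do an
    actual analog/digital conversion.
    """
    if dvi_port_type in ("DVI-I", "DVI-A"):
        return "passive pin adapter (signal stays analog)"
    if dvi_port_type == "DVI-D":
        return "active converter (analog <-> digital conversion)"
    raise ValueError(f"unknown DVI port type: {dvi_port_type}")

print(adapter_needed("DVI-I"))  # passive pin adapter (signal stays analog)
print(adapter_needed("DVI-D"))  # active converter (analog <-> digital conversion)
```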
Scenario 2 is essentially the same as 1, except there's no adapter at the video card port. Again, it's an analog signal from the video card, and at the monitor end you need either a cheap passive adapter (keeping it an analog VGA signal) or an active converter (making it a digital DVI signal).
#3 is the only one that will almost certainly be analog at both ends: analog output from the card's DVI-I connector, a passive adapter to connect the pins to the cable, and analog input on the monitor. You could, however, again take digital output from the video card and use an active converter to make an analog signal for the cable. Obviously that would be silly. But if you had a digital-only video output (DVI-D), you'd have to do it that way.
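To pull the three scenarios together, here is a hypothetical summary sketch (scenario labels and the helper function are mine), assuming the common case: DVI-I on the video card (has analog pins) and DVI-D on the monitor (digital only), joined by an analog VGA cable.

```python
# Hypothetical summary of the three scenarios, assuming a DVI-I card port
# and a DVI-D monitor port. The cable in every case is analog VGA.

def signal_at_each_end(card_port, monitor_port):
    """Signal type at the card and monitor ends of an analog VGA cable."""
    # The VGA cable carries analog; a DVI-I card just routes its analog pins.
    card = "analog"
    # A DVI-D monitor port needs an active converter, so it sees digital.
    monitor = "digital" if monitor_port == "DVI-D" else "analog"
    return card, monitor

scenarios = {
    "1: DVI card -> adapters -> DVI monitor": ("DVI-I", "DVI-D"),
    "2: VGA card -> adapter  -> DVI monitor": ("VGA", "DVI-D"),
    "3: DVI card -> adapter  -> VGA monitor": ("DVI-I", "VGA"),
}

for name, (card_port, monitor_port) in scenarios.items():
    print(name, "=>", signal_at_each_end(card_port, monitor_port))
```

With a DVI-D monitor, scenarios 1 and 2 come out analog at the card and digital at the monitor; only scenario 3 is analog end to end.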