dszd0g,
Referring back to your timing diagrams: I'm not sure whether those are correct or not, so I'll take your word for it. However, the problem I see with that theory is that the technology behind the LCD's input system is different from, say, a CRT's. There are most likely a number of registers and some memory to hold the serial input from the DVI (and most likely an ADC to convert a D-SUB signal).
From what I remember, the video card sends clocked data to a TMDS transmitter, which serializes 24 bits of color (RGB), Hsync, Vsync, and a few other signals that are currently unused onto three high-speed differential data pairs, plus a clock pair. Actually, the transmitter is divided up into three logical encoders (red, green, blue) of 8 bits each (24 bits total). There are two control signals per encoder (6 total), and a global clock. Of the six control signals, only two are used (the others are driven low for compliance and future support), and those two are Hsync and Vsync.
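To make that channel layout concrete, here's a rough C sketch of the per-pixel payload a single-link transmitter serializes each clock. This is purely illustrative; the struct and field names are my own, not any real driver's API:

#include <stdint.h>
#include <stdbool.h>

/* What the graphics chip hands the TMDS transmitter on each pixel clock. */
struct tmds_pixel {
    uint8_t red;    /* 8 bits per encoder...            */
    uint8_t green;
    uint8_t blue;   /* ...24 bits of color total        */
    bool    hsync;  /* the two control signals in use   */
    bool    vsync;
    bool    ctl[4]; /* the four unused controls, driven low */
};

/* Each color encoder gets its own differential data pair, with the control
 * bits riding on those pairs during blanking; a fourth pair carries the clock. */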
The TMDS receiver then recovers this data stream and converts it back into 24-bit RGB plus Hsync and Vsync. As far as I understand, the only reason Hsync and Vsync are required by the spec is that DVI is compatible with VESA standards, and if we had a digital-input CRT we would need the Hsync and Vsync signals. LCDs, for better or worse, do not need the Hsync and Vsync signals (they use the clock for synchronization). This is why I believe changing the refresh rate in Windows will have no effect if you are using an LCD over a DVI interface...
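Just to illustrate what I mean by "they use the clock for synchronization": here is a toy sketch of a fixed-resolution panel placing pixels purely by counting recovered pixel clocks. The panel size and the write_to_column_driver() function are made up for the example:

#include <stdint.h>

#define PANEL_W 1280
#define PANEL_H 1024

/* Hypothetical interface to the panel's row/column drivers (stubbed out). */
static void write_to_column_driver(uint32_t col, uint32_t row,
                                   uint8_t r, uint8_t g, uint8_t b)
{
    (void)col; (void)row; (void)r; (void)g; (void)b;
}

static uint32_t col, row;

/* Called once per recovered pixel clock while active video is present. */
void on_pixel_clock(uint8_t r, uint8_t g, uint8_t b)
{
    write_to_column_driver(col, row, r, g, b);

    if (++col == PANEL_W) {      /* end of a line: wrap to the next row */
        col = 0;
        if (++row == PANEL_H)    /* end of a frame: wrap back to the top */
            row = 0;
    }
}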
Now, I'm not sure about analog, because the LCD has to have a circuit to convert the analog signal to digital... I do not know whether raising the refresh rate would allow a better sampling rate or not... I haven't looked into it.
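For what it's worth, the sampling clock the ADC would need scales directly with refresh rate, since the pixel clock is roughly horizontal total x vertical total x refresh. A quick back-of-the-envelope in C, using typical 1280x1024 VESA totals (the exact blanking numbers here are just an example):

#include <stdio.h>

int main(void)
{
    double h_total = 1688.0;              /* 1280 active + horizontal blanking */
    double v_total = 1066.0;              /* 1024 active + vertical blanking   */
    double refresh_hz[] = { 60.0, 75.0 };

    for (int i = 0; i < 2; i++) {
        double pixel_clock = h_total * v_total * refresh_hz[i];
        printf("%.0f Hz refresh -> ~%.0f MHz pixel/sample clock\n",
               refresh_hz[i], pixel_clock / 1e6);
    }
    return 0;
}

So a higher refresh rate means the ADC has to sample faster, but whether that actually samples the analog signal any "better" is the part I haven't looked into.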
Let me know if I am not making any sense!