Well, I guess I should tell ya the whole story. Truth is, I'm deciding between the Dell P1110 21" and the IBM P260. The biggest difference is that the P260 offers a DVI-A input whereas the Dell has two VGA connectors. The P260 is also $10 cheaper. I've been Googling like mad, but it doesn't seem anyone has really asked this question before.
My current Hitachi SuperScan 753 supports awesome resolutions and refresh rates, but when I go to 1600x1200x85 I notice the image gets blurry and shows moiré patterns. I'm convinced this is because of the low bandwidth of the VGA cable. Also, here...
http://www.cadenceweb.com/2001/0801/cadlab_3dcards0801.html
states...
All of the cards feature at least one DVI-I connector, which includes a higher-quality analog signal than the old VGA port, to better support the higher resolutions offered on larger CRT monitors. All of the cards also offer a VGA connector except the Fire GL4, which eschews the VGA port for a second DVI-I port (two DVI-I to VGA adapters are included with the card for compatibility). Very few CRT monitors offer a DVI-I or DVI-A (the analog-only version) connector, but if it is available, connecting the monitor and graphics card with a DVI-A cable will offer a higher-quality image.
...which leads me to believe I'm right. Note: I'm not contradicting you or anyone here; it's just that I want to be sure, as I plan on running 1600x1200x85 and I don't want to be stuck with the same problem I have now (it's unusable at that setting).
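Just to put a number on the bandwidth argument, here's a rough back-of-the-envelope sketch in Python. The 1.4x blanking-overhead factor is my own assumption (roughly in line with typical VESA GTF timings), not a spec value:

    # Rough pixel-clock estimate for a CRT video mode:
    # active pixels per second, padded ~40% for the horizontal and
    # vertical blanking intervals (assumed factor, near VESA GTF).
    def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.4):
        return width * height * refresh_hz * blanking_overhead / 1e6

    for w, h, hz in [(1280, 1024, 85), (1600, 1200, 85)]:
        print(f"{w}x{h} @ {hz} Hz -> ~{pixel_clock_mhz(w, h, hz):.0f} MHz")

That works out to roughly 156 MHz for 1280x1024x85 and 228 MHz for 1600x1200x85 (the published VESA pixel clock for the latter is 229.5 MHz). At well over 200 MHz, any loss or reflection in the cable and connectors starts to visibly soften the image, which is exactly why the quality of the analog path matters to me.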
I also don't want to confuse the issue more, but what about BNC? This is just an aside, since I'm not getting a monitor that supports BNC, but the consensus seems to be that BNC is better at higher resolutions. Thinking aloud, though: how can that be when the signal still comes out of a VGA connector on the card? Doesn't that bottleneck it?
Anyways, back to DVI-A... can anyone please elaborate on the issue?