Why no DVI on CRTs?

Scootin159

Diamond Member
Apr 17, 2001
3,650
0
76
I know DVI is pretty widely available on LCDs, but why not CRTs? Wouldn't it make sense to have the DAC inside the monitor rather than on the video card, since it could more closely match the monitor's capabilities? This would also eliminate any interference caused by poor-quality cables (and KVMs). Does DVI still use the video card's RAMDAC? If not, this would also allow poorer-quality cards to provide better 2D image quality (as well as 3D, though it's not as much of an issue there). There is always the argument that D-sub is much more prevalent than DVI, but if manufacturers offered DVI on their monitors, we would quickly see DVI become much more prevalent on video cards. You could also argue that this would raise the cost of a monitor - but by how much? I doubt it could be more than $20.
 

RedShirt

Golden Member
Aug 9, 2000
1,793
0
0
That is a good idea...

However, I don't think you'll see it. No video card manufacturer would risk not putting a DAC on a card, and no monitor manufacturer is going to make their monitor 20 dollars more expensive than the competition.
 

Scootin159

Diamond Member
Apr 17, 2001
3,650
0
76


<< That is a good idea...

However, I don't think you'll see it. No video card manufacturer would risk not putting a DAC on a card, and no monitor manufacturer is going to make their monitor 20 dollars more expensive than the competition.
>>



Dropping the DAC makes sense on the video card side, but I think the extra $20 for the monitor would be accepted (especially on higher-end monitors - what's $20 on a $500 monitor?).
 

luv2chill

Diamond Member
Feb 22, 2000
4,611
0
76
Check out the IBM P260 and P275 monitors. I'm hoping to pick one up soon.

21" ------ check!
Stealth Black ----- check!
FD Trinitron -------- check!
DVI-I Connector ------- check!

l2c
 

Smbu

Platinum Member
Jul 13, 2000
2,403
0
0
yeah, the 21" FD Trinitron IBM P260 and P275 monitors have a DVI connector in addition to the standard HD15 connector.
Too bad my 19" FD Trinitron IBM P96 only has 2 HD15 connectors.:(
 

Gosharkss

Senior member
Nov 10, 2000
956
0
0
The limitation per the DVI standard is a 162 MHz pixel clock. That's high enough to support 1600 x 1200 at 60 Hz or 1280 x 1024 at 85 Hz (162 MHz happens to be the VESA-standard clock for that timing on CRTs).

However, not all DVI implementations can support the full 162 MHz clock.

The pixel clock limits the maximum resolution and refresh rate of the monitor. DVI would work fine for lower-resolution 17" and 19" CRT monitors that are only capable of 1280 x 1024 at 85 Hz. The current implementation of DVI cannot support the larger 21" or higher-end 19" monitors that run 1280 x 1024 at 100 Hz or 1600 x 1200 at 85 Hz.
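
To put rough numbers on that, here's a quick back-of-the-envelope sketch (Python, not from the original post): multiply the total pixels per frame (active plus blanking) by the refresh rate and compare it to the 162 MHz limit. The horizontal and vertical totals are approximate VESA CRT timings, so treat the results as ballpark figures.

# Required pixel clock = total pixels per frame (active + blanking) x refresh rate.
# The totals below are approximate VESA CRT timings; real monitors may differ slightly.
DVI_SINGLE_LINK_LIMIT_HZ = 162e6  # single-link DVI limit quoted above

# (mode name, horizontal total, vertical total, refresh rate in Hz)
modes = [
    ("1280 x 1024 @ 85 Hz", 1728, 1072, 85),
    ("1600 x 1200 @ 60 Hz", 2160, 1250, 60),
    ("1600 x 1200 @ 85 Hz", 2160, 1250, 85),
]

for name, h_total, v_total, refresh in modes:
    pixel_clock = h_total * v_total * refresh  # pixels per second
    verdict = "fits" if pixel_clock <= DVI_SINGLE_LINK_LIMIT_HZ else "exceeds"
    print(f"{name}: ~{pixel_clock / 1e6:.1f} MHz -> {verdict} the 162 MHz limit")

That works out to roughly 157.5 MHz, 162 MHz, and 229.5 MHz respectively, which lines up with the modes listed above: the first two squeeze under the limit, while 1600 x 1200 at 85 Hz does not.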

Higher-speed DVI chips are in development; however, to upgrade you will need to replace both the monitor and the video card.

Also, the RAMDAC (the chip that generates the analog video signal on the video card) has been integrated into the graphics controller chip for years now. Adding DVI means adding a chip to both the video card and the monitor, which adds cost to both.
 

Jerboy

Banned
Oct 27, 2001
5,190
0
0


<< The limitation per the DVI standard is a 162 MHz pixel clock. That's high enough to support 1600 x 1200 at 60 Hz or 1280 x 1024 at 85 Hz (162 MHz happens to be the VESA-standard clock for that timing on CRTs).

However, not all DVI implementations can support the full 162 MHz clock.

The pixel clock limits the maximum resolution and refresh rate of the monitor. DVI would work fine for lower-resolution 17" and 19" CRT monitors that are only capable of 1280 x 1024 at 85 Hz. The current implementation of DVI cannot support the larger 21" or higher-end 19" monitors that run 1280 x 1024 at 100 Hz or 1600 x 1200 at 85 Hz.

Higher-speed DVI chips are in development; however, to upgrade you will need to replace both the monitor and the video card.

Also, the RAMDAC (the chip that generates the analog video signal on the video card) has been integrated into the graphics controller chip for years now. Adding DVI means adding a chip to both the video card and the monitor, which adds cost to both.
>>



The reason an LCD can use DVI (Digital Visual Interface) is that it operates digitally. A CRT has to drive the electron beam with an analog signal to draw the image on the screen.

Check out this site, it's quite informative:
http://www.neuro-logic.com/digital.htm
 

Gosharkss

Senior member
Nov 10, 2000
956
0
0
Many LCDs use the standard VGA (analog) interface. It's simply a function of where you put the D/A converter.

Even monitors with a DVI interface convert to analog at the LCD driver level. The digital signal must be converted to an analog in order to achieve the 16M colors. If LCD was pure digital only two colors, black and white would be achievable. In order to generate the 16M colors each red, green and blue cell must be capable of stepping through 256 shades this is an analog function. In fact, most LCDs maintain the video signal in analog form through to the pixel drivers (NEC was the most notable producer of these).