Why no DVI CRTs?

dpopiz

Diamond Member
Jan 28, 2001
4,454
0
0
Why don't CRTs start using DVI and just have an internal DAC? Not only would the image be slightly better quality at high res, but video cards could then switch to DVI-only.
 

bex0rs

Golden Member
Oct 20, 2000
1,291
0
0
Refresh rates don't go very high at high resolutions using DVI.

~bex0rs
 

JeremiahTheGreat

Senior member
Oct 19, 2001
552
0
0
It's a chicken-and-egg problem..

Monitor manufacturers won't start putting DVI on their CRTs until DVI is standard on video cards (or until the analog output is gone entirely), and video card manufacturers won't drop the VGA port until all monitors have DVI.. or something like that..

And don't forget that monitor manufacturers want to reduce costs, so DVI inputs would probably be limited to high-end monitors.. blah blah..
 

ADxS

Member
May 26, 2001
81
0
0
You probably wouldn't want to use DVI on a high-end monitor anyway, since it's going to be limited to 165 MHz of video bandwidth. Even dual-link is going to fall well below the max afforded by today's 350 and 400 MHz DACs.
 

dpopiz

Diamond Member
Jan 28, 2001
4,454
0
0
Actually, to make it even easier, why not just standardize on the DVI-I connector? It shouldn't cost any more to put DVI-I on CRTs, since they'd still just be using the analog pins.
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
Originally posted by: ADxS
You probably wouldn't want to use DVI on a high-end monitor anyway, since it's going to be limited to 165 MHz of video bandwidth. Even dual-link is going to fall well below the max afforded by today's 350 and 400 MHz DACs.

400 MHz? Now when would you want to use THAT? Do the math - let's say you want to do 1920x1280 at 85 Hz. You'll need about 1300 total lines and about 2400 total pixels per line for that, so you'll end up with a pixel clock of around 265 MHz.

DACs should have headroom so as to give sharp edges - but on a digital signal this is not needed. So it'll be a while until 2x 165 MHz DVI limits anyone. And besides, how good do you think a 400 MHz analog signal will look after travelling 1.5 meters of cable?
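In case anyone wants to plug in their own numbers, here's the same arithmetic as a little Python sketch (the 2400-pixel and 1300-line totals are the rough figures above, not exact VESA timings):

[code]
# Pixel clock = total pixels per line x total lines x refresh rate
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

# 1920x1280 @ 85 Hz with ~2400 total pixels per line and ~1300 total lines:
print(pixel_clock_mhz(2400, 1300, 85))  # ~265 MHz -- needs dual-link DVI (2 x 165 MHz)
[/code]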

regards, Peter
 

ADxS

Member
May 26, 2001
81
0
0
You miss my point: that 1920x1280 at 85Hz - your example, not mine - is already way beyond what a single-link DVI can deliver.
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
So? The fact that you shouldn't try to run a $2000 monitor on a $50 graphics card is hardly surprising.
 

Zen0ps

Member
Feb 13, 2002
27
0
0
A high-MHz RAMDAC like a 400 MHz one does not go as high as you might think. About 2048x1536 @ 85 Hz is the maximum for a RAMDAC of that speed.
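Rough sanity check in Python, if you don't believe me (the ~37% horizontal and ~6% vertical blanking overheads are assumed GTF-style values, not exact VESA timings):

[code]
# Estimated pixel clock for 2048x1536 @ 85 Hz with GTF-style blanking overheads
h_active, v_active, refresh = 2048, 1536, 85
h_total = h_active * 1.37  # assumed horizontal blanking overhead
v_total = v_active * 1.06  # assumed vertical blanking overhead
print(h_total * v_total * refresh / 1e6)  # ~388 MHz -- right at the edge of a 400 MHz RAMDAC
[/code]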

It's an analog converter, and as all electronics people eventually find out, you can *never* run an analog converter at 100 percent of its rated speed. Same idea as a stereo system: it may be rated for a certain RMS wattage, but turn the dial up to 70 to 90 percent of the rated maximum and in most cases it starts to distort.

I too would like to see a CRT manufacturer use a DVI input and then convert inside the monitor (so there is no signal loss from going through the 6-foot VGA cable and connectors). But the thing is, no one makes a really good third-party computer video RAMDAC over 350 MHz. Matrox uses their own, and ATI integrates theirs into the main video chip itself.
 

ADxS

Member
May 26, 2001
81
0
0
Originally posted by: Peter
So? The fact that you shouldn't try to run a $2000 monitor on a $50 graphics card is hardly surprising.

$50 graphics card? 1920x1440x75Hz is not an uncommon res on a 21" CRT. What $500 graphics card do you suggest that will do that?
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
It's not like there aren't any ... take Matrox Parhelia for example. Both DVI outs on that one are dual channel. And you even have $100 left then :)

Whether or not some of the more mainstream graphics cards implement dual-link on the DVI output, who knows ... card specifications are mostly vague on that issue.

There are quite a few VGA chips that, besides the integrated single-channel TMDS transmitter, have an alternate path to an external one that may then be dual-channel capable. Knowing that, and seeing that mainstream VGA cards do not have separate transmitter chips on board, it seems it's all single channel there.

Besides, even single channel TMDS has a two-pixels-per-clock mode that doubles the pixel throughput at the expense of color depth. This is a very common thing on large TFT panels.

An additional thing that is being done right now is the migration to "reduced blanking period" timings on both ends, reducing the amount of time spent in horizontal/vertical beam flyback. This brings the pixel frequency of a given resolution and refresh rate down - current CRTs typically waste 1/3 of the signal bandwidth in the front and back porch of the actual image signal. That's quite a bit of headroom waiting to be squeezed out.
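A quick Python sketch of what that buys you (the overhead factors are my own rough CRT-style vs. reduced-blanking assumptions, not exact VESA CVT numbers):

[code]
# Pixel clock for a given mode, with assumed horizontal/vertical blanking overheads
def clock_mhz(h, v, hz, h_overhead, v_overhead):
    return h * h_overhead * v * v_overhead * hz / 1e6

# 1920x1200 @ 60 Hz:
print(clock_mhz(1920, 1200, 60, 1.35, 1.04))  # ~194 MHz with CRT-style blanking -- beyond single-link DVI
print(clock_mhz(1920, 1200, 60, 1.08, 1.03))  # ~154 MHz with reduced blanking -- fits in 165 MHz
[/code]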

regards, Peter
 

Gosharkss

Senior member
Nov 10, 2000
956
0
0
CRT monitors are analog devices (so are LCDs, but that's another topic). DVI does not have the bandwidth to support resolutions higher than 1280x1024 at 85 Hz.

The RAMDAC has been integrated into video graphics chips for many years now, so the cost is very low. Adding DVI to a monitor means adding a DVI receiver chip and a RAMDAC, which simply makes no sense since the benefit would not outweigh the cost.

ViewSonic did introduce some CRTs with DVI and promptly pulled them from the market.

 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
Isn't it funny how there's always someone who comes late to a thread, claiming stuff that has long been contradicted further up? Read first, then post. PLEASE.
 

dpopiz

Diamond Member
Jan 28, 2001
4,454
0
0
I really don't think it would be too horribly expensive. For a lot of computer things, I really wish people would just accept paying a few more dollars in order to help switch to a new, better, simpler standard.

Peter:
SECONDED!
 

Gosharkss

Senior member
Nov 10, 2000
956
0
0
I would argue that it is not a better, simpler standard. VGA is the better, simpler standard for CRT monitors. If you think the video image would be better, think again; it would not. CRT monitors are analog, not digital. Also, nobody makes standalone RAMDAC chips anymore, so the monitor manufacturers couldn't do it even if they wanted to.

 

ADxS

Member
May 26, 2001
81
0
0
Originally posted by: Peter
It's not like there aren't any ... take Matrox Parhelia for example. Both DVI outs on that one are dual channel. And you even have $100 left then :)

Lol. Peter, I should have been more clear: I actually have the Parhelia, connected to a 22" NEC digital CRT. The highest resolution and refresh rate available is 1280x1024x75Hz - no ifs, ands or buts about it.

Now, I am perfectly willing to accept the argument that the Parhelia may suck (though it was *your* example), and/or that my 22" NEC monitor sucks, but if the pricey Parhelia cannot drive a pricey 22" NEC to higher than 1280x1024x75Hz via DVI, what hope is there for DVI CRTs in the low- and mid-range? :)
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
Originally posted by: Gosharkss
I would argue that it is not a better, simpler standard. VGA is the better, simpler standard for CRT monitors. If you think the video image would be better, think again; it would not. CRT monitors are analog, not digital. Also, nobody makes standalone RAMDAC chips anymore, so the monitor manufacturers couldn't do it even if they wanted to.

Then how come this IS being done on high end monitors? And also, how come the image quality IS noticeably better when you use the same monitor with the same display settings on digital DVI rather than analog VGA?

There is a benefit in having the signal travelling the cable in a lossless digital fashion. It's real, the technology exists, people have it on their desks. Get over it.


ADxS, now does that NEC thing not implement dual-channel DVI or what? That would be a silly move ... Otherwise you should be talking to Matrox and NEC to make them figure out what's going wrong. Point being, the VGA firmware/drivers and the display unit must come to a consensus about activating the 2nd channel. Maybe there's something fishy either in the Parhelia BIOS or drivers, or in the monitor's EDID data set. Given how rarely dual-channel DVI actually happens, I wouldn't be very surprised if there were some work left to do in that area.

regards, Peter
 

Gosharkss

Senior member
Nov 10, 2000
956
0
0
Those who are doing it, NEC being the most notable, use a discrete D-to-A converter. Having designed these myself, I know that this is the worst possible scenario when it comes to keeping the video signals clean. The best way is an integrated on-chip solution like the ones in all of today's video cards.

On a CRT monitor, the video signal and interface connections are the least of our worries when it comes to better video. Convergence, focus and geometric distortion are much larger problems on a CRT than the video interface is.

The best scenario is an analog CRT monitor with a built-in video cable, where the monitor-side video cable is soldered directly to the video amp PCB. That gives the fewest connections and provides the best possible video signal to the CRT itself.
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
NEC must have done a really bad job in handling the analog input then ... since there is such a noticeable difference in image sharpness when switching from digital to analog input.
 

Gosharkss

Senior member
Nov 10, 2000
956
0
0
Remember something else, Peter: CRT monitors are for the most part hand-tuned at the factory, so they are like fingerprints and no two are exactly alike; maybe you got a good one. Also, why would you buy a nice big 22" NEC and run it at 1280x1024? Seems like a waste to me.
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
(Note: I don't own one personally, I just had a look :))

That's question #2 we're wrestling with here: where's the 2nd channel gone in the combination of that NEC monitor and the Parhelia? Who ate it? I know for sure the Parhelia has it, so why doesn't it get used? Does the Parhelia BIOS and/or driver not figure out that the NEC monitor has it? Is the monitor's EDID information not representing the capability correctly? Or does the monitor not even implement it to start with (which, then, would REALLY be a stupid move)? Let's have Matrox and NEC figure this one out ...

regards, Peter
 

Gosharkss

Senior member
Nov 10, 2000
956
0
0
Peter

It is my belief that the Matrox card, like all the others, supports single-link DVI-I only. Single link has become the standard because the bandwidth is sufficient for ALL LCD monitors on the market today, and even looking well into the future. Single link supports 1600x1200 at 60 Hz, which is more than enough for 99% of the LCDs on the market today.
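That figure checks out, if I remember the standard timing right (totals quoted from memory, so treat this Python one-liner as a sketch):

[code]
# Standard 1600x1200 @ 60 Hz timing as I recall it: 2160 total pixels per line, 1250 total lines
print(2160 * 1250 * 60 / 1e6)  # 162 MHz -- just inside single-link DVI's 165 MHz
[/code]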

Dual link adds more cost to both the video card and the monitor; unfortunately it always comes down to unit cost, pure and simple. Also, the manufacturers (me included) do not think the benefit of DVI on a CRT outweighs the cost.

The monitor / video card market is extremely competitive; even a few cents added to the cost of a product cuts significantly into the manufacturers' margins. You can see the list of dead video card manufacturers as testament to this.

 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
Well the Parhelia DVI output, at least one of the two, does have twin channels. The card even comes with a dual-headed breakout cable that lets you use the 2nd channel individually, for a 3rd display.

I agree, if you use an LCD panel, 165 MHz gets you quite far ... even further than 1600x1200 with reduced-blanking timings.