Analogue connection: when will it be obsolete and replaced by DVI?

littlegohan

Senior member
Oct 10, 2001
828
0
0
I will be purchasing an LCD monitor soon, and I don't know if I should buy one with a DVI connection. I am planning to use this monitor for a fairly long time. If analogue outputs are no longer on video cards five years from now, an analogue-input-only LCD monitor will be obsolete.


By the way, I heard that 2D quality can vary greatly between different video cards. However, if I use the DVI output, the signal should bypass the filter, and the picture quality output from different cards should look the same on the LCD, right?
 

Bothware

Member
Jul 26, 2002
25
0
0
Originally posted by: littlegohan
I will be purchasing an LCD monitor soon, and I don't know if I should buy one with a DVI connection. I am planning to use this monitor for a fairly long time. If analogue outputs are no longer on video cards five years from now, an analogue-input-only LCD monitor will be obsolete.

I expect HD15 analogue connections to survive for a long time yet - all professional screens are still CRTs, and that is unlikely to change in the next three years.

By the way, I heard that 2D quality can vary greatly between different video cards. However, if I use the DVI output, the signal should bypass the filter, and the picture quality output from different cards should look the same on the LCD, right?

No - there are large variations in 2D image quality. Radeons are great, and all Matrox cards do great 2D, though nVidia cards don't.
The R9700 looks to be the best yet.

Get the DVI model anyway, though, because the image quality will be slightly higher.

What TFT are you getting?

Remember - resolution AND response time matter on TFTs.
 

Bothware

Member
Jul 26, 2002
25
0
0
Originally posted by: DeRusto
I was under the impression that the DVI-I version of the connector could support analog CRT connections.

Anandtech - Digital Visual Interface (DVI) Explained

Wouldn't this mean that CRTs could begin using the DVI connection as a replacement and all existing monitors could simply use a connection converter of some sort?

Yes, but that won't improve image quality!

DVI-I or DVI-D, when used with a digital flat panel, ensures that no image quality is lost to crosstalk, ADC/DAC conversion, and so on.

The actual image quality depends on the display controller, other GPU/VPU circuitry, and the screen itself. Using DVI-I with an adapter for a CRT will work, but there's no point unless it's a second head or your graphics card only has DVI outputs.
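
If it helps, here's a toy model of where the analogue path can lose quality (a Python sketch; the noise figure is made up purely for illustration, not measured from any real cable):

# Toy model: the card's DAC converts each 8-bit pixel to a voltage, the
# cable adds noise/crosstalk, and the LCD's ADC re-quantises the result.
# A digital DVI link skips all three steps, so the bits arrive unchanged.
import random

def analogue_path(pixel, noise_lsb=2.0):
    volts = pixel / 255.0                        # RAMDAC output
    volts += random.gauss(0, noise_lsb / 255.0)  # cable noise / crosstalk
    return min(255, max(0, round(volts * 255)))  # panel ADC re-quantisation

source = [10, 100, 200, 250]
print("digital: ", source)                       # DVI: bits unchanged
print("analogue:", [analogue_path(p) for p in source])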


Oh crap, I've explained this really badly! And it's not exactly what the thread is about!

Just get DFPs with DVI-I or DVI-D.

Keep CRTs on HD15 or 5BNC; adapters can be used, but there's no benefit.

 

DRGrim

Senior member
Aug 20, 2000
459
0
0
Originally posted by: Bothware

Radeons are great, and all Matrox cards do great 2D, though nVidia cards don't.
The R9700 looks to be the best yet.
nVidia cards used to have a little problem with image quality, but it is much improved now. As for the best, Matrox's Parhelia is supposed to have much higher quality filters than any other video card, explained here.

As for the question, I would get a monitor that supports DVI. The LCD I have has changeable cables, so right now I have the VGA cable in, but if I get a card with DVI out I would just switch cables. Hope this helps.
 

Bothware

Member
Jul 26, 2002
25
0
0
Although high refresh rates are good, those of the R9700 are more than high enough. The article linked above judges IQ purely on refresh rate - unless I've misinterpreted it - but IQ to me is what it LOOKS like!
 

DRGrim

Senior member
Aug 20, 2000
459
0
0
Originally posted by: Bothware
Although high refresh rates are good, those of the R9700 are more than high enough. The article linked above judges IQ purely on refresh rate - unless I've misinterpreted it - but IQ to me is what it LOOKS like!
Yes, you have misinterpreted it. Perhaps I should have linked to the first page of the image quality portion of the article. Sorry about that; if you read it over, it should help you understand better.
 

DeRusto

Golden Member
May 31, 2002
1,249
0
86
Originally posted by: Bothware
Originally posted by: DeRusto
I was under the impression that the DVI-I version of the connector could support analog CRT connections.

Anandtech - Digital Visual Interface (DVI) Explained

Wouldn't this mean that CRTs could begin using the DVI connection as a replacement and all existing monitors could simply use a connection converter of some sort?

Yes, but that won't improve image quality!

DVI-I or DVI-D, when used with a digital flat panel, ensures that no image quality is lost to crosstalk, ADC/DAC conversion, and so on.

The actual image quality depends on the display controller, other GPU/VPU circuitry, and the screen itself. Using DVI-I with an adapter for a CRT will work, but there's no point unless it's a second head or your graphics card only has DVI outputs.


Oh crap, I've explained this really badly! And it's not exactly what the thread is about!

Just get DFPs with DVI-I or DVI-D.

Keep CRTs on HD15 or 5BNC; adapters can be used, but there's no benefit.

Perhaps I did not explain my point well enough. :) What I meant was that the HD15 connection could easily be replaced so that future CRTs connect through DVI-I. I was just thinking that perhaps all monitors could use one connection rather than the video card having two. That is what I see as (hopefully) happening in the near future.

 

Bothware

Member
Jul 26, 2002
25
0
0
No, it won't happen, because of the maximum resolution DVI supports.

However, some CRTs now have DVI in addition to HD15. I don't think anyone would ever make a CRT with only DVI, but if they did, it would not be bigger than 19" - unless we see a new DVI, but that would not be backward compatible anyway.
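
Rough numbers, if you want to check me (a Python sketch; the 165 MHz figure is the single-link TMDS pixel-clock limit, and the ~25% blanking overhead is my ballpark assumption for CRT-style timings):

# Does a CRT-class video mode fit within single-link DVI's pixel clock?
SINGLE_LINK_TMDS_MHZ = 165   # single-link TMDS pixel-clock limit
BLANKING_OVERHEAD = 1.25     # rough allowance for blanking intervals

def pixel_clock_mhz(width, height, refresh_hz):
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h, hz in [(1280, 1024, 60), (1600, 1200, 85), (2048, 1536, 85)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clk <= SINGLE_LINK_TMDS_MHZ else "exceeds"
    print(f"{w}x{h} @ {hz} Hz needs ~{clk:.0f} MHz - {verdict} single-link DVI")

The high-refresh modes a good CRT runs at blow straight past the limit, which is exactly why HD15 stays.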
 

Locutus4657

Senior member
Oct 9, 2001
209
0
0
Originally posted by: littlegohan
I will be purchasing an LCD monitor soon, and I don't know if I should buy one with a DVI connection. I am planning to use this monitor for a fairly long time. If analogue outputs are no longer on video cards five years from now, an analogue-input-only LCD monitor will be obsolete.

By the way, I heard that 2D quality can vary greatly between different video cards. However, if I use the DVI output, the signal should bypass the filter, and the picture quality output from different cards should look the same on the LCD, right?

DVI will replace analog when an LCD monitor with a response rate quick enough for games and other such applications is invented. So far, LCDs just aren't fast enough to be useful to gamers and too many professionals. There is significant variance in "2D" image quality; the Matrox Parhelia seems to be the best right now. Matrox traditionally excels in what is referred to as "2D" image quality.

Carlo

 

DeRusto

Golden Member
May 31, 2002
1,249
0
86
So the DVI connector itself has a low maximum supported resolution? I thought it was just that the digital output only had enough bandwidth for certain resolutions.

From what you are saying, it would seem that a card with only DVI-I connectors (like the Matrox Parhelia) would not be able to support high resolutions on any standard CRT. That doesn't make any sense, as business users of this card would more than likely need very high resolutions. If the DVI connector itself is unable to support high resolutions, then this card is a very poor move on Matrox's part.

If it is true, however, that a DVI connector cannot display high resolutions, then I can understand why it will not replace the HD15 connector.

But if the DVI connector could display resolutions on its analog pins just as high as an HD15 connector can, then it would seem a good idea to use DVI for all monitors.
 

Bothware

Member
Jul 26, 2002
25
0
0
Originally posted by: DeRusto
So the DVI connector itself has a low maximum supported resolution? I thought it was just that the digital output only had enough bandwidth for certain resolutions.

From what you are saying, it would seem that a card with only DVI-I connectors (like the Matrox Parhelia) would not be able to support high resolutions on any standard CRT. That doesn't make any sense, as business users of this card would more than likely need very high resolutions. If the DVI connector itself is unable to support high resolutions, then this card is a very poor move on Matrox's part.

If it is true, however, that a DVI connector cannot display high resolutions, then I can understand why it will not replace the HD15 connector.

But if the DVI connector could display resolutions on its analog pins just as high as an HD15 connector can, then it would seem a good idea to use DVI for all monitors.

DVI-I has extra pins through which to transmit analogue signals. When used with a DVI-I to VGA converter, it can provide high resolutions, although a dedicated VGA port is arguably better. However, when used to drive an LCD/TFT screen, the resolution is limited to a mere 1280x1024 in digital or 1600x1200 in analogue/digital.

DVI-D, I believe, was limited to 1600x1200 but can now do 1920x1200 thanks to higher-frequency signalling from TMDS.
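
To put rough numbers on those limits (a Python sketch, same assumptions as before: 165 MHz single-link TMDS clock and ~25% blanking overhead; reduced-blanking timings squeeze out a bit more, which is how 1920x1200 becomes possible):

# Maximum refresh rate single-link DVI can manage at a given resolution.
TMDS_LIMIT_HZ = 165e6  # single-link TMDS pixel-clock limit
BLANKING = 1.25        # rough allowance for blanking intervals

def max_refresh_hz(width, height):
    return TMDS_LIMIT_HZ / (width * height * BLANKING)

for w, h in [(1280, 1024), (1600, 1200), (1920, 1200)]:
    print(f"{w}x{h}: ~{max_refresh_hz(w, h):.0f} Hz max over single-link DVI")

1920x1200 only clears 60 Hz once you drop most of that blanking overhead, which is the trick the higher-frequency signalling relies on.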

DVI-I is more expensive than a VGA port and requires a TMDS chip - and converters to VGA/HD15 also cost money.

There are other variants of DVI, but none of them is anywhere near mainstream.

You could use DVI to power an analogue monitor, but the resolution limits would still apply. The only screen I know of that uses this is the Iiyama 453, which has been discontinued, and its replacement does not support it. I think Sony has released a screen for the US market that uses it in addition to HD15, but I'm not sure.

There's no reason why graphics card manufacturers couldn't provide a proprietary connection on the card to a breakout box with two or even three of every connector - although it would cost a bit more than normal.

We'll just have to wait and see what happens.

One last point: you may think "buy DVI-I now to future-proof", but don't bother, because by the time DVI hits the big time the spec will have improved and will run at higher frequencies!
 

DeRusto

Golden Member
May 31, 2002
1,249
0
86
Well, I suppose that all makes sense. :)

When DVI DOES hit the big time, I hope it will be a proper replacement for the current standard... just for simplicity's sake.
 

Bothware

Member
Jul 26, 2002
25
0
0
Originally posted by: DeRusto
Well, I suppose that all makes sense. :)

When DVI DOES hit the big time, I hope it will be a proper replacement for the current standard... just for simplicity's sake.

It certainly will be - unless they devise something new instead of an updated DVI.
We need high-res, low-response-time screens at affordable prices first, though.
Graphics cards could easily cope even as they are today.