If a video card is DVI only, does the internal RAMDAC come into play at all?

NFS4

No Lifer
Oct 9, 1999
72,647
26
91
If a video card is DVI only, is there any reason why the internal RAMDAC would be needed? Even if you run an analog monitor with an external DVI-to-analog dongle, would the RAMDAC come into play at any point, or would everything still run off the digital signal?
 

Bozo Galora

Diamond Member
Oct 28, 1999
7,271
0
0
All video cards process digital information.
Analogue monitors can't process digital info.
Video cards must use a RAMDAC (RAM digital-to-analogue converter) to
export a usable signal to an analogue monitor.
DVI-out cards MAY have both a 15-pin D-sub analogue output and DVI-D,
or may have a single or dual DVI-I combined output.
The traces to the connector determine whether the signal runs through the RAMDAC.
If your analogue monitor has a picture, the signal has been run through a DAC.
Some LCDs accept analogue input signals, some digital, some both.
Analogue hookup for an LCD only works if it has an onboard ADC.
In that case the digital video signal goes through the RAMDAC, is changed to analogue,
then reconverted to digital by the ADC in the monitor. (heh heh)
That's why there are three connector types: DVI-I, DVI-D and DVI-A.
LCD does not necessarily mean a digital signal from the card.
Your monitor must say DVI-I interface in its spec sheet.
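
To make that concrete, here's a rough Python sketch (my own illustration, not from any datasheet) of what the RAMDAC and an LCD's onboard ADC are doing: the "RAM" part would normally be a palette lookup, and the "DAC" part scales the 8-bit pixel value onto the nominal 0-0.7 V analogue video level VGA uses.

FULL_SCALE_V = 0.7  # nominal VGA analogue video level at full intensity

def dac_output(code: int, full_scale: float = FULL_SCALE_V) -> float:
    # RAMDAC on the card: 8-bit digital colour value (0-255) -> analogue voltage
    return (code / 255) * full_scale

def adc_input(voltage: float, full_scale: float = FULL_SCALE_V) -> int:
    # ADC in an analogue-input LCD: voltage -> 8-bit value again (idealized, no noise)
    return round((voltage / full_scale) * 255)

for code in (0, 64, 128, 255):
    volts = dac_output(code)
    print(f"pixel value {code:3d} -> {volts:.3f} V -> re-digitized as {adc_input(volts):3d}")

In this idealized version every value survives the round trip; the real degradation comes from noise and sampling error, which is what makes the double conversion worth a "heh heh".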

All music starts out as analogue.
It may now be recorded digitally, by using an ADC (analogue-to-digital converter).
A CD itself stores only digital data.
All CD players are twofold: a transport and a DAC.
High-end hi-fi sells them as separate components.




 

Bozo Galora

Diamond Member
Oct 28, 1999
7,271
0
0

Note the following:
The Digital Visual Interface (DVI) was developed by the Digital Display Working Group (DDWG). The lobbyists behind DVI include many companies that were originally involved in DFP. Although it has not been accepted as a standard by VESA, DVI has very good prospects for the future because the digital transfer protocol is still TMDS (PanelLink). In comparison to P&D and DFP, which only have one link, DVI incorporates a second link, which doubles the maximum pixel rate. This allows resolutions over 1280 x 1024 pixels. A further advantage of DVI is the fact that analog signals can also be transferred. Therefore, older cathode ray tube monitors can still be connected if needed.

link 1
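
To put a number on the "doubles the maximum pixel rate" point: a single TMDS link tops out at a 165 MHz pixel clock, and the pixel clock a mode needs is roughly width x height x refresh rate plus blanking overhead. Here's a back-of-the-envelope Python sketch; the ~25% blanking figure is just an assumption for illustration, not a CVT/GTF timing calculation.

SINGLE_LINK_MHZ = 165.0   # single-link TMDS pixel clock limit
BLANKING_OVERHEAD = 1.25  # assumed ~25% extra for horizontal/vertical blanking

def required_pixel_clock_mhz(width, height, refresh_hz, overhead=BLANKING_OVERHEAD):
    # crude estimate of the pixel clock a given mode needs
    return width * height * refresh_hz * overhead / 1e6

for width, height, refresh in [(1280, 1024, 60), (1600, 1200, 60), (2048, 1536, 60)]:
    clk = required_pixel_clock_mhz(width, height, refresh)
    verdict = "fits on a single link" if clk <= SINGLE_LINK_MHZ else "needs dual link"
    print(f"{width}x{height}@{refresh}Hz -> ~{clk:.0f} MHz ({verdict})")

Which is why the second link starts to matter once you go much past 1280 x 1024 / 1600 x 1200.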



Also note here the two connectors:
DVI-D (digital only): 24 pins
DVI-I: 24 pins (digital) + 4 analogue


here
 

Bozo Galora

Diamond Member
Oct 28, 1999
7,271
0
0

this shows the pinouts for the current connectors.

also......
(Quote)

An acronym for Digital Visual Interface. DVI is a concerted-effort specification created by the Digital Display Working Group (DDWG) to accommodate analog and digital monitors with a single connector. The Digital Display Working Group is an open industry group led by Intel, Compaq, Fujitsu, Hewlett-Packard, IBM, NEC and Silicon Image. The objective of the Digital Display Working Group is to address the industry's requirements for a digital connectivity specification for high-performance PCs and digital displays. There are three different DVI configurations: DVI-A, designed for analog signals; DVI-D, designed for digital signals; and DVI-I, the integrated specification, designed for both analog and digital signals. This connection is NOT compatible with the 20-year-old VGA 15-pin connector. It is an entirely different connector that is beginning to show up on display cards and monitors as of late 2001. The original signal is pure digital, but when it is sent to a DVI interface through the appropriate port, the digital signal is converted to analog if the program or the monitor asks for it.
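
As a quick way to keep the three variants straight, here's a small, purely illustrative Python sketch of which signals each connector carries and what that means for a passive DVI-to-VGA dongle like the one NFS4 asked about:

# Which signal types each DVI connector variant carries
DVI_SIGNALS = {
    "DVI-A": {"analog"},
    "DVI-D": {"digital"},
    "DVI-I": {"analog", "digital"},
}

def passive_vga_dongle_works(connector: str) -> bool:
    # A passive DVI->VGA dongle only reroutes the analogue pins,
    # so it only works where the port actually carries the RAMDAC's analogue output.
    return "analog" in DVI_SIGNALS[connector]

for connector, signals in DVI_SIGNALS.items():
    print(f"{connector}: carries {sorted(signals)}, "
          f"passive VGA dongle works: {passive_vga_dongle_works(connector)}")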

and this link talks a bit about single and dual link connectors for higher
resolutions:



link
 

Bozo Galora

Diamond Member
Oct 28, 1999
7,271
0
0

and a little about the double conversion on a non-digital-input LCD:

In thin film transistor displays, up to four transistors control each pixel (or, picture element) on the display. Each pixel is a digital bit of data that should correlate perfectly to the digital video data being generated by the computer, and therefore the image should have no chance to distort.

This is in marked contrast to the CRT, which was at its heart an analogue device, and required the video card to convert the digital information into an analogue signal in order to display it. As the thinking went, if you kept the signal digital all the way to the flat panel, you'd get a superior image.

But digital monitors have not won the day--at least not yet--for a few reasons. Because the analogue D-Sub connector was the common link between graphics cards and monitors, the move to a straight digital interface, which requires a different connector, didn't automatically happen when flat panels became available. Graphics card manufacturers weren't inclined to move to a straight digital card because it would limit their market to just digital displays. By the same rationale, the manufacturers of flat-panel monitors weren't inclined to go strictly digital because it would limit the number of graphics cards the monitors could be used with.

And so, for the most part, both groups stuck with the analogue D-Sub connector, and when there was a concession made to digital, the resulting product was a hybrid model containing both analogue and digital, thus adding to the cost.

Initially, the cost of adding DVI (Digital Visual Interface) to a monitor or graphics card was substantial, another reason manufacturers opted solely for the widely used D-Sub connector. Recently, the price differential has come down, which has had two effects. First, there's now more of an incentive to include both on a product, and we've seen more graphics cards with both connectors onboard. The flip side is that the price of adding both connectors isn't great enough to justify getting rid of the D-Sub connector at the monitor end, which means that for the short term, most flat-panel monitors that incorporate DVI will be hybrid models that also retain the D-Sub.

But what about image quality of analogue versus digital displays? While the image of early analogue flat panels was often marred by shimmy and shake (an artifact of the double conversion from digital to analogue, then back), newer flat-panel monitors feature much better analogue conversion, to the point where many users can't even tell the difference.

According to Mickey Mantiply, display products line manager with IBM, subjects in one test were asked to choose which image they liked better--a digital one or an analogue one displayed on the same monitor. Opinion was split right down the middle--certainly not a compelling argument for the superiority of either technology, and little justification for making the move solely to digital, just yet.
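
For what it's worth, the double-conversion degradation the article mentions is easy to model in a toy way: push each pixel value through a DAC, add a little analogue noise, and re-digitize it with an ADC, and some pixels come back at slightly different values. The noise level in this Python sketch is an arbitrary assumption, only there to show the effect.

import random

FULL_SCALE_V = 0.7  # nominal VGA full-intensity level

def round_trip(code: int, noise_v: float) -> int:
    volts = (code / 255) * FULL_SCALE_V              # RAMDAC on the graphics card
    volts += random.uniform(-noise_v, noise_v)       # cable / sampling noise
    code_back = round((volts / FULL_SCALE_V) * 255)  # ADC in the analogue-input LCD
    return max(0, min(255, code_back))

random.seed(1)
changed = sum(1 for code in range(256) if round_trip(code, noise_v=0.002) != code)
print(f"{changed} of 256 grey levels came back changed after the round trip")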
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Short answer:

On a DVI-only card such as the AIW Radeon cards, the DVI-to-VGA dongle just uses the analog pins from the DVI connector, so yes, the RAMDAC is definitely used in analog mode.
 

gregor7777

Platinum Member
Nov 16, 2001
2,758
0
71
NFS4,

You may want to rename this thread "All you need to know about DVI-*" by Bozo Galora. :)

Sheesh, that's a lot of info.