Radeon X850 w/ a CRT Monitor?

piromaneak

Senior member
Feb 27, 2005
225
0
0
I've noticed that the X850 has dual DVI ports only, so I'm wondering: are there any CRT monitors that have DVI connections, or are there any adapters so that I can use a CRT monitor with an ordinary D-sub 15-pin VGA connection with the X850? I DO NOT want to get an LCD, so don't even try to convince me heh... I love my CRTs. Any feedback appreciated.
 

Bar81

Banned
Mar 25, 2004
1,835
0
0
I believe the DVI ports are DVI-I, meaning they also carry the analog signal from the card's DACs, so you can use those cheap adapters (which should be included with the card) to run a CRT through VGA.
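As a rough illustration (a minimal sketch, not vendor documentation): a DVI-I port carries both the digital pins and the analog pins that the card's RAMDAC drives, so the passive plug adapter just routes the analog pins to a VGA connector and does no conversion of its own. A DVI-D port has no analog pins, which is why the same adapter wouldn't work there.

```python
# Minimal sketch: which DVI connector variants can feed a VGA CRT through a
# passive pin adapter. The adapter itself converts nothing; it only breaks out
# the analog pins that the card's RAMDAC already drives.

DVI_CARRIES_ANALOG = {
    "DVI-I": True,   # integrated: digital + analog pins (what the X850 exposes)
    "DVI-A": True,   # analog-only variant, rarely seen on cards
    "DVI-D": False,  # digital-only: a passive VGA adapter will not work
}

def passive_vga_adapter_works(connector: str) -> bool:
    """True if a simple DVI->VGA plug adapter can drive a CRT from this port."""
    return DVI_CARRIES_ANALOG.get(connector, False)

if __name__ == "__main__":
    for conn in DVI_CARRIES_ANALOG:
        verdict = "works" if passive_vga_adapter_works(conn) else "does not work"
        print(f"{conn}: passive VGA adapter {verdict}")
```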
 
Mar 11, 2004
23,444
5,852
146
I think only the X850XT PE has dual DVI.

And yes, even then you could do it, as they bundle at least one DVI-to-VGA adapter.
 

piromaneak

Senior member
Feb 27, 2005
225
0
0
Ahh, thx mega works for pointing out the adapters in the kit... Now I'm wondering: do you lose the DVI quality now that the signal has to go through that adapter and the cable, or is it still a better picture than just a straight VGA signal?

More importantly, do those adapters work reliably?
 

Bar81

Banned
Mar 25, 2004
1,835
0
0
Some people say you do, but if there is a difference it's *very* slight, and then again they could be imagining things: all that little adapter does is route the DVI port's analog pins out to a VGA connector, and the DAC on the card does the actual conversion.
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
Originally posted by: Bar81
I believe the DVI ports are DVI-I, meaning they also carry the analog signal from the card's DACs, so you can use those cheap adapters (which should be included with the card) to run a CRT through VGA.

i think i had two adapters... hate them... would rather have one VGA output... LCDs suck... and now i have 1 or 2 inches more sticking out at the back of the PC for the stupid adapter
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Now I'm wondering: do you lose the DVI quality now that the signal has to go through that adapter and the cable, or is it still a better picture than just a straight VGA signal?

Do you have a high-end CRT? If so, the DVI-to-VGA adapter will be easily inferior to straight VGA, forget coming close to being better. The higher the bandwidth requirements for a given setting, the larger the hit in quality. For reference, my CRT has dual inputs, and even to non-enthusiast eyes (in other words, people besides myself) the signal degradation is clear: everyone agrees straight VGA is clearly superior in quality to the DVI-to-VGA signal when pushing high settings.
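To put a rough number on "higher bandwidth," here's a back-of-the-envelope sketch; the ~30% blanking overhead is an assumed, GTF-style figure rather than anything from this thread. The higher the resulting pixel clock, the more the analog image depends on the card's DAC, its output filtering, and the adapter and cable in the path.

```python
# Rough sketch of why higher resolution/refresh stresses the analog path.
# The blanking overhead is an assumption (GTF-style timings add very roughly
# 25-35% on top of the visible pixels); exact timings vary by monitor.

def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.30):
    """Approximate pixel clock in MHz for a CRT mode, including blanking."""
    total_pixels = width * height * (1 + blanking_overhead)
    return total_pixels * refresh_hz / 1e6

for w, h, r in [(1280, 1024, 85), (1600, 1200, 85), (1600, 1200, 100)]:
    print(f"{w}x{h}@{r}Hz ~ {approx_pixel_clock_mhz(w, h, r):.0f} MHz pixel clock")
```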
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Do you have a high-end CRT? If so, the DVI-to-VGA adapter will be easily inferior to straight VGA

I disagree. I've used an adapter with my AIW cards for years on my high-end monitor with dual inputs, and I've never noticed any loss of quality over a straight VGA connection. I've compared it directly with several models of S3, Intel onboard, ATI and Nvidia cards on several different PCs, with both my 22" CRT and my 55" widescreen at high resolutions... it's virtually indistinguishable.
 

SneakyStuff

Diamond Member
Jan 13, 2004
4,294
0
76
Trust me, you're not going to notice. My cousins and I have used dual-CRT setups on GeForce and Radeon cards over the years, with VGA and DVI connections. Using the adapter for the secondary monitor, I can tell you from my own experience that there is no visible difference. So I really wouldn't worry about it :)
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I've compared it directly with several models of S3, Intel onboard, ATI and Nvidia cards on several different PCs, with both my 22" CRT and my 55" widescreen at high resolutions... it's virtually indistinguishable.

Set your monitor to 1600x1200@100Hz and see if you can say the same.
 
Mar 19, 2003
18,289
2
71
Originally posted by: BenSkywalker
I've compared it directly with several models of S3, Intel onboard, ATI and Nvidia cards on several different PCs, with both my 22" CRT and my 55" widescreen at high resolutions... it's virtually indistinguishable.

Set your monitor to 1600x1200@100Hz and see if you can say the same.

This is a bit OT, but when I used to have a CRT that could do 1600x1200 at 100/109Hz, it was noticeably blurry on my 6800GT no matter what (either the DVI output with an adapter or the VGA output; a friend's 6800OC gave me the same results, and so did a replacement monitor).

What would be the cause of that? Crappy DACs? :confused:

It doesn't matter to me much anymore, since I use an LCD over DVI, but I'm still curious.