imported_Burkeman
Originally posted by: Seer
There is a huge difference in the sharpness and clarity of text when switching from VGA to DVI on my brother's 19" 1280x1024.
Originally posted by: jiffylube1024
Originally posted by: Seer
There is a huge difference in the sharpness and clarity of text when switching from VGA to DVI on my brother's 19" 1280x1024.
Ditto. I've used 17" and 19" LCDs primarily (but also 20" and 24" widescreens), and DVI makes a huge difference for text and browsing. DVI is crystal clear; even with ClearType turned on, an LCD over VGA looks blurry compared to DVI.
Some video cards are better than others at making VGA look acceptable on an LCD, but DVI is always the better solution, and in many cases the difference in sharpness is huge.
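Since ClearType keeps coming up alongside the VGA/DVI comparison, here is a minimal, read-only sketch (Windows only, not from the thread) that checks whether font smoothing and ClearType are actually enabled, using the standard Control Panel\Desktop registry values; the helper name and output format are just for illustration.

```python
# Illustrative, Windows-only sketch -- reads the standard font-smoothing
# registry values; function name and output format are mine, not from the thread.
import winreg

def cleartype_status():
    """Return (smoothing_on, cleartype_on) from HKCU\\Control Panel\\Desktop."""
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop") as key:
        smoothing, _ = winreg.QueryValueEx(key, "FontSmoothing")  # "2" = smoothing enabled
        try:
            smoothing_type, _ = winreg.QueryValueEx(key, "FontSmoothingType")  # 2 = ClearType
        except FileNotFoundError:
            smoothing_type = None
        return smoothing == "2", smoothing_type == 2

if __name__ == "__main__":
    on, ct = cleartype_status()
    print(f"Font smoothing enabled: {on}; ClearType: {ct}")
```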
Originally posted by: dug777
No need to get all defensive there, mate. I was simply going on my personal experience, and clearly I'm not at all alone, as you can see from others in this thread.
I am interested to see a side-by-side pic taken under identical conditions, however, as I said.
Originally posted by: Seer
Originally posted by: dug777
No need to get all defensive there, mate. I was simply going on my personal experience, and clearly I'm not at all alone, as you can see from others in this thread.
I am interested to see a side-by-side pic taken under identical conditions, however, as I said.
Sorry, I didn't mean to offend you or anything. Maybe it sounded confrontational, but I was just saying that there is a rather large difference.
Originally posted by: dug777
EDIT: off topic, but when you use a DVI-VGA adapter, I assume the signal isn't converted through the card's DACs? (since it's a purely digital signal, I assume) How does that work? Does the adapter actually do it? Does the monitor have a DAC? :headasplodes:
Originally posted by: alpha88
Originally posted by: dug777
EDIT: off topic, but when you use a DVI-VGA adapter, I assume the signal isn't converted through the card's DACs? (since it's a purely digital signal, I assume) How does that work? Does the adapter actually do it? Does the monitor have a DAC? :headasplodes:
The DVI socket can carry an analog signal. Nearly all video cards output both analog and digital on their DVI port; the adapter just rewires the analog signal's pins to the D-sub connector.
Most DVI inputs on monitors only take the digital signal, though analog inputs exist as well. Digital-only outputs also exist.
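To tie that back to dug777's question: a passive DVI-to-VGA adapter has no DAC of its own, so the card's RAMDAC is still what generates the analog signal; the adapter merely routes the DVI connector's analog pins (C1-C5, plus pin 8 for V-sync) out to the D-sub. The little Python sketch below just encodes that rule; the enum and function names are made up for illustration, not from any real API.

```python
# Illustrative only -- models which DVI port types let a passive
# DVI-to-VGA adapter work. Names here are my own, not from the thread.
from enum import Enum

class DviPort(Enum):
    DVI_I = "digital TMDS + analog pins"   # what most consumer cards ship
    DVI_D = "digital only, no analog pins"
    DVI_A = "analog only (rare)"

def passive_vga_adapter_works(port: DviPort) -> bool:
    """A passive adapter has no DAC; it only re-routes the DVI analog
    pins to a VGA D-sub, so it works exactly when the source port
    carries the card's analog (RAMDAC) signal."""
    return port in (DviPort.DVI_I, DviPort.DVI_A)

if __name__ == "__main__":
    for port in DviPort:
        verdict = ("passive adapter is fine"
                   if passive_vga_adapter_works(port)
                   else "needs an active converter with its own DAC")
        print(f"{port.name} ({port.value}): {verdict}")
```

On a DVI-D port there are simply no analog pins to route, which is why a passive adapter does nothing there and an active converter with its own DAC is required.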
Originally posted by: Harvey
LCDs still can't match the best CRTs. Try your comparison on a good 19" or larger CRT.
Originally posted by: butt head
Here's my 2c worth:
I am sitting in front of 2 Dell 2405 monitors (24", 1920 x 1200). One is connected via a DVI cable and the other with a VGA cable. I can't speak for anyone else, but I can tell you guys that the picture on both is identical and equally stunning. There is absolutely no difference in quality / detail / colour etc. I have swapped cables, switched one monitor from DVI to VGA and so on, and I can confirm that absolutely no one in my team (I'm a graphic designer, 20 years in the business) can tell the difference!!
I know that this should not be so, but it is.
BH
Originally posted by: ProviaFan
There is most certainly a noticeable difference on my Samsung 213T (at 1600x1200, of course).