
DVI or VGA?

knowley

Senior member
Hi people

I have got a new Hitachi 17" TFT.

It has a max refresh of 60Hz in DVI mode and 75Hz in VGA mode at 1280x1024.

According to this review, the monitor displays more colours when in VGA mode.

So the question is: would you use DVI (no digital-to-analogue conversion) or VGA (higher refresh rate and more colours) for playing games?

Thanks very much for your input!
 
I don't think the difference in refresh rate is noticeable on an LCD, since the panel's response time isn't that quick anyway.

And the digital/analogue conversion isn't that big of a deal either; most monitors handle it well.

For gaming, it's really up to you. Try both and see which one looks prettier to you. If what the review says is true, that VGA gives more colour, you'd probably wanna go with VGA.

The digital/analogue problem is more noticeable outside of games, most noticeably when scrolling web pages, where you get the streaking or whatever they call it.

So my opinion is: VGA for games, DVI for regular use.
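A rough way to see why 60 vs 75Hz matters so little on a TFT of this era is to compare the frame period against the panel's pixel response time. A minimal sketch, assuming a ~25 ms response time (a typical early-2000s figure, not a number from this thread):

```python
# Sketch: frame period vs panel response time on an early-2000s TFT.
# PANEL_RESPONSE_MS is an assumed typical figure, not a measured spec.

PANEL_RESPONSE_MS = 25.0  # assumed typical pixel response time

def frame_period_ms(refresh_hz):
    """Time between refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 75):
    period = frame_period_ms(hz)
    print(f"{hz} Hz -> {period:.1f} ms/frame "
          f"(panel response of ~{PANEL_RESPONSE_MS:.0f} ms dominates)")
```

Either way the pixels take longer to change than a frame lasts, so the panel's response time, not the refresh rate, limits what you see.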
 
Thanks for that m8... so there's no hard and fast rule then? I always thought DVI was the newer, better technology, but there are fewer options on the monitor in DVI and fewer colours as well, so what's the point?

It's gonna be one connection for the gaming machine, and the other connection for my work machine /cough when I work 😉
 
Originally posted by: knowley
well it's gonna be one connection for the gaming machine, and the other connection for my work machine /cough when I work 😉

haha, my bad 🙂

Try the VGA out; if you think the "more colour" it produces is noticeable, then go for VGA.

If they look the same in gaming, then definitely go for DVI: no streaking.
 
Originally posted by: knowley
so there's no hard and fast rule then? I always thought DVI was the newer, better technology, but there are fewer options on the monitor in DVI and fewer colours as well, so what's the point?
DVI is newer, but VGA has been around for so long that everyone has perfected it.

I personally always go for DVI, for no streaking.

As for fewer options: DVI gets the true image from the computer, so you'll have to adjust things in the computer's settings. Most settings you can change on VGA can also be done in the control panel (those tools that come with Nvidia/ATI).

And the part about giving less colour... I dunno about that one. Haven't read the review you linked to yet.
 
Aye, it's a fecking gr8 monitor!

I had an old Compaq tft5010 before, and I can't believe the colourful world I have been missing out on, and that's using DVI on this Hitachi. I can actually see people in UT2003 where before they appeared to be too far away!

Well, Tom's gave it 3/5 for colour and stated this at the end:
"The score awarded for "Number of colors displayed" is the average for colors obtained in analog mode (4/5) and DVI mode (2/5)."

Can't find any 17" TFT reviews on Anandtech... have you read any others about this Hitachi?
 
Originally posted by: knowley
Hi people
I have got a new Hitachi 17" TFT.
Well done. Welcome to slimline gaming ville! lol
So the question is, would you use DVI (no digital-to-analogue conversion) or VGA (higher refresh rate and more colours) for playing games?
Use the DVI. The colours are there, but the TFT will blend several other colours to make that colour, which isn't noticeable. At all.
Refresh rates do not matter on TFTs. Just make sure that you disable V-Sync in your video driver options, and you won't be limited to 60fps.
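The "blend several other colours to make that colour" trick is spatial dithering: a 6-bit panel approximates an 8-bit shade by mixing its two nearest levels across neighbouring pixels. A minimal illustration, assuming a generic 2x2 ordered-dither pattern (real panels use their own proprietary FRC/dither schemes):

```python
# Sketch: how a 6-bit TFT can approximate an 8-bit shade by spatially
# dithering between the two nearest 6-bit levels. Illustrative only;
# real panels use proprietary dither/FRC patterns.

BAYER_2X2 = [[0, 2],
             [3, 1]]  # ordered-dither thresholds for a 2x2 pixel cell

def dither_6bit(value_8bit, x, y):
    """Map an 8-bit shade (0-255) to a 6-bit level (0-63) at pixel (x, y)."""
    level = value_8bit >> 2          # nearest-lower 6-bit level
    frac = value_8bit & 0b11         # remainder 0-3 that gets dithered
    if frac > BAYER_2X2[y % 2][x % 2]:
        level = min(level + 1, 63)   # bump some pixels to the next level
    return level

# Average the 2x2 cell back on the 8-bit scale (x4): it lands on the input.
shade = 130
cell = [dither_6bit(shade, x, y) for y in range(2) for x in range(2)]
avg_8bit = sum(c * 4 for c in cell) / 4
print(cell, avg_8bit)
```

Averaged over the cell (and over your eye's resolution), the mix reads as the in-between shade, which is why the blending is hard to spot.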
 
Can anyone explain why colours are typically worse over DVI? My understanding was that you would keep a purer signal with digital, because you don't do an unnecessary conversion to analogue, then back to digital.

Can anyone clear this up?
 
I too have a Hitachi 17" TFT, but dunno what you mean by only 60Hz in DVI mode. I'm running 75Hz in DVI mode just fine right now. Also, DVI looks MUCH better; it's far clearer and sharper. I've had friends over at the same time and we ran dual, one on DVI, the other on VGA, and we both agreed without doubt DVI is much nicer.
 
I don't think it matters whether you use 60 or 75Hz. I have the VP171b. Before you install the Viewsonic driver, the monitor shows up as plug and play and offers both 60 and 75Hz. After you install the driver, only 60Hz is available. I couldn't notice any difference between the two either.
 
I highly doubt that you are running above 60Hz in DVI mode.
What's probably happened is that you have set it to 75Hz in the display properties, but it is actually running at 60Hz. My video card reports the refresh rate as 75Hz, but the monitor says 60Hz.
Either that, or you are running off the DSUB connector.
It is impossible to run above 60Hz with DVI at 1280x1024, because it exceeds the maximum bandwidth of the DVI interface.
 
It is impossible to run above 60Hz with DVI at 1280x1024, because it exceeds the maximum bandwidth of the DVI interface.
When do we get a better DVI (DVI2?)? The monster Viewsonic and IBM screens that do 3840x2400 are rad, but not when they need 2 DVI cables and have a slow refresh/redraw.
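For what it's worth, the bandwidth claim can be sanity-checked with rough pixel-clock arithmetic. A sketch assuming the standard VESA DMT blanking totals for 1280x1024 and the 165 MHz single-link TMDS cap; by these numbers 75Hz actually fits within the DVI spec, so 60Hz ceilings on panels of this era likely came from the transmitter/receiver silicon rather than the interface itself:

```python
# Sketch: pixel-clock math behind "60Hz over DVI" limits.
# Totals are standard VESA DMT timings for 1280x1024 (blanking included);
# 165 MHz is the single-link DVI/TMDS pixel-clock cap.

SINGLE_LINK_DVI_MHZ = 165.0  # max TMDS pixel clock per DVI link

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock needed to scan h_total x v_total pixels refresh_hz times/s."""
    return h_total * v_total * refresh_hz / 1e6

# VESA DMT totals for 1280x1024 (active + blanking):
modes = {
    "1280x1024 @ 60Hz": (1688, 1066, 60),
    "1280x1024 @ 75Hz": (1688, 1066, 75),
}
for name, (ht, vt, hz) in modes.items():
    clk = pixel_clock_mhz(ht, vt, hz)
    verdict = "fits" if clk <= SINGLE_LINK_DVI_MHZ else "exceeds"
    print(f"{name}: {clk:.1f} MHz ({verdict} a single link)")
```

The 3840x2400 screens mentioned above are a different story: that many pixels at even modest refresh rates blows well past one link's budget, hence the multiple DVI cables.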
 
America woke up then?... lol

Cool, so stick with DVI for the sharper picture etc while gaming, and I prolly won't notice the difference in colours anyway!

Nice one, thanks!
 
In our testing, if you have a video card with good VGA filters (i.e. any recent ATI product), then there will be no difference in picture quality between VGA and DVI. That is, of course, if the DVI of the LCD is up to par (and we have noticed a vast number that aren't). Also, most LCDs actually look better when left at a 60Hz refresh rate, since refresh isn't actually an issue with LCDs.
 
then there will be no difference in picture quality between vga and dvi
??? I switch between DVI and VGA and have seen both on 3 other friends' machines and video cards; VGA is a blurry mess compared to DVI.
 
When switching between DVI and VGA, do you reset the LCD? I'm just giving my experience here at work. If the graphics card's VGA filters are good and the DVI implementation on the LCD is optimal, then we have noticed no difference in picture quality by blind test or using testing software. As usual, everyone's mileage may vary, but this is what we've noticed.
 
Can you plug a monitor into both VGA and DVI from the same graphics card?... When I tried it and attempted to switch, it beeped constantly!
 
Originally posted by: knowley
Can you plug a monitor into both VGA and DVI from the same graphics card?... When I tried it and attempted to switch, it beeped constantly!
Didn't do that with my 9700pro. I had my Hitachi CML174SXW plugged into my 9700pro using both the DVI & DSUB connectors. It just acted as if there were two monitors there. Of course, it only displayed the image from the connection that was enabled in the TFT's OSD.
 
Originally posted by: BoomAM
Originally posted by: knowley
Can you plug a monitor into both VGA and DVI from the same graphics card?... When I tried it and attempted to switch, it beeped constantly!
Didn't do that with my 9700pro. I had my Hitachi CML174SXW plugged into my 9700pro using both the DVI & DSUB connectors. It just acted as if there were two monitors there. Of course, it only displayed the image from the connection that was enabled in the TFT's OSD.

Must be a problem with the Geforce Ti4800 then? I had it running the DVI screen, used the monitor's OSD to change to the VGA connection, and the picture changed fine, but a constant beep came from the computer, so I shut it down. Weird... nm, cause I don't want to do that anyway.

With your CML174SXW, do you find the DVI's brightness is already maxed out?

Thanks
 
Originally posted by: knowley
With your CML174SXW, do you find the DVI's brightness is already maxed out?
If you mean the main TFT brightness, the one you adjust from the TFT itself, then yes, it was set to 100% brightness. I've lowered it to 60% though, which is more than enough.

 
If all things were equal, I'd like to think that the DVI output would give more of the picture control to the video card (and its drivers), since the signals remain digital. My 1800FPs say "no adjustment necessary" when in DVI mode.

Unfortunately, one of my LCDs (all are 1800FPs) won't come out of DVI "power save" mode. It just happened over the last few days. 🙁

-SUO
 
there will be no difference in picture quality between vga and dvi

Wow, this is not my experience, and yes, we did do the auto setup when changing from DVI => VGA.
The differences between VGA and DVI on the monitors that we checked ranged from little/indiscernible to quite obvious, depending on the monitor, with DVI always better (or equal on the ones with small differences).
 