HDMI - Unable to get 1920 x 1200 ??

ChinaCat

Member
Jul 14, 2002
55
0
0
I use a Samsung 245T.

I was using a 9800 GTX Video card with DVI interface.

I just upgraded to the EVGA 470 GTX, and since it comes with an HDMI interface and my monitor accepts it, I thought that would be a higher-quality connection.

With the HDMI interface, the maximum resolution I can obtain is 1920 x 1080. Why?

I have to lower it to 1768 x 992 to get the same widescreen size as before.

Am I better off using DVI and getting my 1920 x 1200 resolution back, which is the resolution I prefer?

Thank you kindly and please advise what you would do.

Cheers -CC
 

KIAman

Diamond Member
Mar 7, 2001
3,342
23
81
To display higher than 1920x1080 resolution via HDMI, you'll need HDMI 1.4-compliant connections and a cable.

AFAIK, the GTX series behaves as HDMI 1.3 until a software update (which could already be out by now).
 

ChinaCat

Member
Jul 14, 2002
55
0
0
Thanks KIAman.

Well, I'm using the most recent 470 GTX drivers. Perhaps the issue is the HDMI connection on my monitor. It's 18 months old, so perhaps it isn't the HDMI version necessary for a higher resolution.

Anyway, if this is the max I can get on my setup, I'm now back on DVI and very happy to have 1920 x 1200 back.

Cheers -CC
 

zagood

Diamond Member
Mar 28, 2005
4,102
0
71
Try tweaking the nVidia control panel: under Change Resolution, hit Customize, then check "Enable resolutions not exposed by the display."

HDMI 1.0 and above support 1920 x 1200 (WUXGA).
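To back that up with rough numbers (just a sketch; the 2080x1235 blanking totals below are typical CVT reduced-blanking values I'm assuming, so check your monitor's EDID for its real timings), 1920x1200 @ 60 Hz needs well under the 165 MHz single-link pixel clock, so raw bandwidth isn't what's blocking it:

```python
# Quick sanity check: 1920x1200 @ 60 Hz vs. the single-link TMDS limit.
# The 2080x1235 totals are typical CVT reduced-blanking values (assumed).

SINGLE_LINK_MAX_HZ = 165e6  # single-link DVI/HDMI pixel clock ceiling

h_total, v_total, refresh = 2080, 1235, 60  # 1920x1200 CVT-RB totals
pixel_clock = h_total * v_total * refresh

print(f"Required pixel clock: {pixel_clock / 1e6:.1f} MHz")       # ~154.1 MHz
print(f"Fits a single link: {pixel_clock <= SINGLE_LINK_MAX_HZ}")  # True
```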

If DVI is working for you, though... why bother?
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I have the same problem with a Westinghouse 26" (1920x1200) and my 5870. I'm using a VGA cable now. Searching didn't turn up much help, other than that other people have had this issue with my monitor. Let me know what you find; I'm always hoping something comes along that might fix this. :)
 

Binky

Diamond Member
Oct 9, 1999
4,046
4
81
DVI is basically the same as HDMI if you don't need audio carried by the same cable. Don't make a lot of work out of it if it isn't necessary - just use DVI, or a DVI-to-HDMI converter.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
I thought 1920x1080 was HD, and therefore HDMI would go up to... 1080. Just a thought!
 

LokutusofBorg

Golden Member
Mar 20, 2001
1,065
0
76
I believe the video signal in HDMI is identical to a DVI signal, and cards with HDMI out basically just package the DVI signal into the HDMI port (those that also carry audio over HDMI are doing a bit more, obviously). There's no reason to prefer HDMI over DVI; they're the same.
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
Yes, because HD cannot output 1920x1200; your max HDMI res would be 1920x1080.

1080i broadcasts ...


thx

Before anyone actually follows Tweakboy's rambling, the above is (as always) wrong.

-ViRGE
 

wrangler

Senior member
Nov 13, 1999
539
0
71
This is interesting because I have encountered the same issue with a 5870 and my 1920x1200 monitor.

Any time I use the HDMI cable, 1920x1080 is the max resolution I can select. Custom resolution doesn't work.

I just went back to DVI as well and no problems.
 

ChinaCat

Member
Jul 14, 2002
55
0
0
Thank you all very much.

Yes, back to DVI at 1920 x 1200, since I don't need sound over it. I have a surround system plugged into my Fatality Champion card.

Seriously, this community rocks.

Cheers -CC
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
You can do DVI > HDMI until you move into the 2,000+ range on horizontal resolution. At that point DVI requires a dual-link connection, and the simple converters people use now will not work. HDMI only has single-link capability as far as DVI is concerned. Single link is 3 pairs of wires carrying RGB data; dual link uses 6 pairs, and the HDMI spec doesn't support that, so you would need to convert those 6 pairs on DVI down to 3 pairs on HDMI by reclocking the signal and increasing the data rate.
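To put rough numbers on where that cutoff falls (just a sketch; the blanking totals are typical CVT reduced-blanking figures I'm assuming here):

```python
# Rough numbers for the single-link vs. dual-link cutoff.
# Blanking totals are typical CVT reduced-blanking values (assumed).

SINGLE_LINK_MAX_HZ = 165e6  # single-link TMDS pixel clock ceiling

modes = {
    "1920x1200": (2080, 1235),
    "2560x1600": (2720, 1646),
}

for name, (h_total, v_total) in modes.items():
    clock = h_total * v_total * 60  # pixel clock at 60 Hz
    verdict = "single link OK" if clock <= SINGLE_LINK_MAX_HZ else "needs dual link"
    print(f"{name}: {clock / 1e6:.1f} MHz -> {verdict}")
# 1920x1200: 154.1 MHz -> single link OK
# 2560x1600: 268.6 MHz -> needs dual link
```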

It's a moot point though, as PCs are moving to DisplayPort, and consumer devices are dropping HDMI in favor of cat5 jacks and wiring running a serial protocol versus a parallel one like HDMI's.
 

Voo

Golden Member
Feb 27, 2009
1,684
0
76
Actually it's really easy: a single DVI link can drive about 2.75 megapixels at 60 Hz, which means that for a 16:10 ratio the maximum supported resolution is

x = (16/10) * y
x * y = 2.75 * 10^6
=> 1.6 * y^2 = 2.75 * 10^6
=> y ≈ 1311; x ≈ 2098

Not that hard, is it? ;) You shouldn't forget cable length, but 1920x1200 really shouldn't be a problem over standard single-link DVI.
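If you want to sanity-check the arithmetic, here it is as a quick script (same 2.75 Mpixel / 60 Hz budget as above, taken as given):

```python
# Re-run the calculation above: biggest 16:10 mode inside a ~2.75 Mpixel
# budget at 60 Hz (the single-link figure assumed above).
import math

max_pixels = 2.75e6
y = math.sqrt(max_pixels / 1.6)  # from x = 1.6*y and x*y = max_pixels
x = 1.6 * y
print(f"max ~{x:.0f} x {y:.0f}")  # -> max ~2098 x 1311
```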