Can't force 1920x1080 on Dynex 55"

kyonu

Member
Dec 1, 2011
55
0
0
I posted this over in the TV section but was referred to this section (post is http://forums.anandtech.com/showthread.php?t=2239542)

Okay, so I have a new Dynex 55" TV that seems to have a pretty crappy response time. When I plug it into my computer via HDMI, my mouse gets a delay that's causing all sorts of problems, like the cursor not being where I want it to be when I move my hand there.

I've been looking around on other forums, and people have been able to force 1920x1080 59Hz (60Hz mode) progressive in the NVIDIA Control Panel, and the TV performs just like a monitor... But I can't for the life of me figure this out in the ATI CCC.

Does anyone know how I can force a resolution of 1920x1080 at 59Hz (60Hz)? The "Custom" resolutions are a joke since you can't specify a refresh rate, and every time I try changing the refresh rate in the Desktop Properties section of CCC, it just reverts back to 60Hz (the TV supports 60, 59, and 50 Hz). In the Windows 7 display settings I can change it to 59Hz, but it doesn't change anything.

I have an ASUS G73 laptop with an nVidia GTX460 in it, and I was able to reproduce the 1920x1080 60Hz trick very quickly and it was very effective--I just made a custom resolution of 1920x1080 at 60Hz and it let me select it. CCC, on the other hand, removes the custom entry because that resolution is already in the list.

My desktop video card is the Radeon HD5770 1GB.

Someone PLEASE help me find out how to do this...

Also I have contacted Dynex support... And they were useless as expected from a Best Buy brand.
 

daveybrat

Elite Member
Super Moderator
Jan 31, 2000
5,766
981
126
That's why I have to use the VGA connector on my 32" Vizio TV. Through HDMI the fonts and text look bad and I have more lag in games for some reason.
 

reallyscrued

Platinum Member
Jul 28, 2004
2,618
5
81
Use VGA, seriously. You won't notice a difference in quality and if you do, it's placebo.

HDMI would maybe give you better blacks but on an LCD, it's already meh.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
Did you try other HDMI ports on the TV to see if it helps?

I have no issues with my 32" Vizio using HDMI with an HD 5850
 

superccs

Senior member
Dec 29, 2004
999
0
0
Try a clean driver install with Driver Sweeper; it could be a software glitch. Also make sure the TV is set to its full resolution and isn't in another view mode.
 

kyonu

Member
Dec 1, 2011
55
0
0
Did you try other HDMI ports on the TV to see if it helps?

I have no issues with my 32" Vizio using HDMI with an HD 5850

Indeed, tried 'em all.


As for the VGA suggestions--that's ludicrous. I bought this TV for a reason: 1080p. If nVidia can do it but ATI can't, it's definitely a software problem; I just can't figure out how to force ATI to put the TV into 1920x1080 60Hz mode.

I wonder... Does anyone know why CCC won't let me make a custom resolution that matches one that's already listed? It just removes the entry once I apply it. I also can't change the refresh rate, for some dumb reason. :(
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
There's a custom resolution section in CCC. In the left panel under "My Digital Flat-Panels" select "HDTV Support" and you can make any resolution you want.

Be aware that many "1080P" HDTVs do not in fact have a native resolution of 1080P and use image scaling, which significantly increases latency. I have a small "720P" Philips panel connected to my compy as a second screen, but I run it through VGA because it ALWAYS applies image scaling over HDMI and there's no way to disable it without the manufacturer's tools. Its native resolution is actually 1440x900, but that isn't a supported resolution over HDMI at all; the screen just turns black.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Have you tried making a custom monitor definition through EDID override and using that instead?
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
I'd second Ferzerp's strategy: try the EDID route. That will force the AMD driver to accept the new resolution, since the setting is pushed to the driver instead of the other way around.

I've used it on various 58XX series AMD cards to make a custom resolution for a monitor over DVI, and it worked to enable the new resolution. I'd expect it would also work for a TV over HDMI.
 

kyonu

Member
Dec 1, 2011
55
0
0
There's a custom resolution section in CCC. In the left panel under "My Digital Flat-Panels" select "HDTV Support" and you can make any resolution you want.

Be aware that many "1080P" HDTVs do not in fact have a native resolution of 1080P and use image scaling, which significantly increases latency. I have a small "720P" Philips panel connected to my compy as a second screen, but I run it through VGA because it ALWAYS applies image scaling over HDMI and there's no way to disable it without the manufacturer's tools. Its native resolution is actually 1440x900, but that isn't a supported resolution over HDMI at all; the screen just turns black.

That was where I was going to set the custom resolution, but as I mentioned, it won't stick... For some reason it shows as "Custom" in the Resolution selection screen in the same window, but it disappears once I apply it and the TV defaults back to the standard 1080p resolution.

Also, I was able to do this with the nVidia card just fine, so how would I find out whether it's doing image scaling or not? The TV menus don't have a way to control scaling (the mode is set to WIDE, and the other options stretch the image off the screen).

Ferzerp said:
Have you tried making a custom monitor definition through EDID override and using that instead?

Where do I find the EDID overrides? I believe I was able to select the NTSC 1920x1080 60Hz value from the same window where I created the custom resolutions, but frankly I have no idea what that did.

It tells me that it adds it to the "Displays Manager" under the "Force" button, but I can't for the life of me find a Force button anywhere in CCC. Where is the Displays Manager? Is it talking about the Windows resolution selection?
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
An EDID override is a custom .inf file for your display that lets you manually configure resolutions and timings to account for overscan/underscan, panels that don't report everything they are capable of displaying, etc.

I think there are some apps capable of making the .infs for you; doing it manually would be rather difficult. I haven't touched the concept in years, though.
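
If I remember right, the override .inf is laid out roughly like this (a skeleton from memory of the Microsoft-documented format, so treat the section names, hardware ID, and bytes as placeholders rather than a working file; the [Version] and [Strings] boilerplate is left out):

; skeleton of an EDID-override monitor INF (illustrative only)
[Manufacturer]
%MfgName% = DynexModels

[DynexModels]
%ModelName% = EDID_OVERRIDE.Install, MONITOR\DYN0055   ; hardware ID is a placeholder

[EDID_OVERRIDE.Install.HW]
AddReg = EDID_OVERRIDE.AddReg

[EDID_OVERRIDE.AddReg]
; value "0" holds the full corrected 128-byte EDID block 0; only the 8-byte header is shown here
HKR, EDID_OVERRIDE, "0", 0x01, 0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00

Once Windows installs that as the monitor driver, the display stack reads the corrected EDID (with the 1920x1080 @ 59/60Hz timing in it) instead of whatever the TV reports.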

edit: PowerStrip can make a custom .inf for a monitor, for example. I remember there being a different app with a name something like "EDID override" that would give you fine control over all the options, though. Anyone else know what I'm talking about?
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
Right-click desktop > Screen resolution > Advanced settings > Monitor tab (in the pop-up box) > changing the resolution doesn't work/stick?

If so I'd try the following:

Open command prompt as admin

Paste the commands or type them in and hit enter after each one

set devmgr_show_nonpresent_devices=1

start devmgmt.msc

Once Device Manager opens up, make sure you select the following (it's under the View menu):

Show Hidden Devices

Take a look at your display adapters and see what is listed. You can delete the greyed out ones if they are present.

Look under the Monitors section and see what is there also. You can delete all of them, as Windows will find them again.
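
If it's easier, you can drop the same steps into a small batch file (just a sketch of the above; the actual cleanup is still done by hand in Device Manager):

@echo off
rem expose devices that are not currently connected
set devmgr_show_nonpresent_devices=1
rem launch Device Manager with that variable in effect
start devmgmt.msc
rem then: View > Show hidden devices, and remove greyed-out
rem entries under Display adapters and Monitors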
 

kyonu

Member
Dec 1, 2011
55
0
0
@ Ferzerp: I can try that. I have experience with the formatting of INF files, so hopefully I can figure it out.

@ Kenmitch: Good suggestion, didn't think about that... But then again this computer has only ever had my HD 5770 in it and the motherboard has no on-board video... Are we looking specifically for other video adapters?
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
@ Kenmitch: Good suggestion, didn't think about that... But then again this computer has only ever had my HD 5770 in it and the motherboard has no on-board video... Are we looking specifically for other video adapters?

Yep... and check what you have listed under Monitors also.

Sometimes video cards will be doubled or tripled, and the same goes for monitors. Delete the extras if you find any. Maybe delete everything under Monitors and see if it helps.
 

djnsmith7

Platinum Member
Apr 13, 2004
2,612
1
0
Text looks terrible (mainly in browsers) using HDMI on my 58" Samsung plasma display, but looks outstanding when I use VGA. My problem was solved & I run everything at 1080p.
 

kyonu

Member
Dec 1, 2011
55
0
0
Text looks terrible (mainly in browsers) using HDMI on my 58" Samsung plasma display, but looks outstanding when I use VGA. My problem was solved & I run everything at 1080p.

Wait, VGA actually goes to 1080p? I thought VGA was analog only (i.e., interlaced only)?

@Kenmitch: great, thanks. I'll try that out once I get home.
 

talion83

Member
Mar 21, 2011
67
0
61
Analog connections (even component) can handle 1080p. The reason most devices don't support it is that those connections don't support any sort of DRM.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
You can open the existing INF file, edit it to change the setting to what you desire, then save it. That should be much easier than creating a new INF from scratch. One of those free EDID utilities should be able to extract the existing INF for you to edit.
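
For reference, the part you'd usually be editing in the extracted INF is the supported-mode range. It looks roughly like this (the section name and exact ranges here are illustrative guesses, not values from your TV):

; illustrative snippet of a monitor INF's mode limits
[Dynex55.AddReg]   ; section name is a placeholder
HKR,,MaxResolution,,"1920,1080"
; horizontal sync range (kHz), vertical refresh range (Hz), sync polarities
HKR,"MODES\1920,1080",Mode1,,"66.0-68.0,59.0-61.0,+,+"

Making sure the vertical range covers 59-61Hz is what lets the driver offer both 59Hz and 60Hz at 1920x1080.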

However, the easiest solution seems to be switching to VGA. I bet even a picky person would have a hard time telling the difference between VGA and HDMI if both are outputting 1080p to the same monitor, so perhaps your aversion to VGA is out of proportion to what you'd actually get. I currently use VGA on a 46" 1080p display and it works great; the pixels look fine to me, and I don't see any blurriness or anything. I'm not sure how using HDMI would be any different on a pixel level, assuming the monitor is set up properly so you aren't pushing part of the picture off the screen or other silliness that isn't a limitation of the VGA connection specifically (i.e., it seems overly harsh to blame VGA for issues that may really be caused by a failure to adjust the TV settings so you can see all the pixels).
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
And being analog, there can be some severe signal degradation.

The problem with a lot of TVs and HDMI is that they don't display a pixel-by-pixel representation. They try to compensate for overscan by shrinking the image, which garbles text. Many TVs have a way to shut this off--"PC mode" or some other setting; it's different per manufacturer and even per TV. The people telling you VGA looks better are really just switching to a mode that the TV doesn't apply its TV processing to. If you can use HDMI or DVI and shut that stuff off, it will look far better than an analog connection will. If you're hell-bent on using VGA, make sure to buy a higher-quality cable as well, but even then you may be out of luck depending on what your video card vendor spent on the analog components of the card. Barely anyone uses VGA anymore, so the signals aren't the highest quality to begin with.


All that said, TVs make extremely poor monitors anyway. The image quality is atrocious.
 

kyonu

Member
Dec 1, 2011
55
0
0
so perhaps your aversion to VGA is out of proportion to what you'd actually get.

Nah, my aversion has always been me thinking it doesn't support 1080p.

Does VGA support a full 60Hz though?

Also, my screen over HDMI does not look garbled or in any way scrambled--I can read text just fine. Should this change when using VGA?

Lastly, I will try the EDID modification to make sure it has the right information in there. Thanks again for informing me of that option.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Also, if your monitor has a "game mode," make sure to turn that on too. And turn off everything in the "advanced picture" settings menu.
 

kyonu

Member
Dec 1, 2011
55
0
0
Also, if your monitor has a "game mode," make sure to turn that on too.

Guess I should mention this... It doesn't have "modes" per se, but it has an option in the menu called "DPMS" which the manual says is required for computer connections, and that is indeed on.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
It would be somewhere in the picture settings area. Perhaps it has presets that have a "custom" option that enables more options.