Can't force 1920x1080 on Dynex 55"


kyonu

Member
Dec 1, 2011
55
0
0
It would be somewhere in the picture settings area. Perhaps it has presets that have a "custom" option that enables more options.

Unfortunately not.

However, I got home and plugged in a VGA cable I had lying around, and the picture instantly went into 1920x1080 @ 60Hz mode, just like on the nVidia setup. It's working very smoothly and everything is very clear, except links for some reason... You can definitely tell there's some signal degradation.

However, until ATI fixes their crap I'm kind of stuck with this...

Thank you all for your help.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
And being analog, there is some severe signal degradation.
...
All that said, TVs make extremely poor monitors anyway. The image quality is atrocious.

Now you've got me wondering: what should I look for on my VGA TV to see the atrocity? Are you talking about a pixel-level thing that I'd only see if I get my face close to the TV? (I usually sit on the couch about 9 feet away, which is where I'm typing this from.)

Or, is it something general that I'd see from a distance, like maybe the blacks are not as black as they could be? How does the VGA atrocity manifest itself?

When I use my DVI or HDMI monitors in another room, the image quality seems similar to the TV's VGA quality. Even with my face right up against the screen, I don't know what to look for; maybe I'd spot it if I knew what I was looking for.

To reiterate, it's a 46" 1920x1080 TV (Samsung), connected via VGA cable.
 

kyonu

Member
Dec 1, 2011
55
0
0
Well, I can attest to how VGA looks on a big TV now that I've been using it. There is a bit of blur, and you definitely cannot see high-def -anything-. Watching movies over VGA is terrible and makes everything look generally lower in quality.

I will be trying DVI to HDMI tomorrow to see the results.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Cool, I'm interested in your results. If there is something easy to point to, that would let me justify a potential upgrade to the financial department in our household, heheh.
 

Matt1970

Lifer
Mar 19, 2007
12,320
3
0
Some TV sets just don't look good using VGA. Some look excellent. If you are going to run from your PC to HDMI on your TV, see if you can label the input "PC". If your TV supports it, that will make a big difference.
 

mrpiggy

Member
Apr 19, 2012
196
12
81
Analog video has supported much higher resolutions than craptastic 1080P for decades. The problem is that most current TV/monitor manufacturers use garbage RAMDACs these days, since the big push is for DRM-enabled HDMI digital. The other issue with TV/monitor VGA is the made-of-toilet-paper-and-string VGA cables that are normally supplied. Analog signals over typical VGA cables "are" subject to interference/loss; however, that is easily overcome with quality cables. For example, back when monitors were all CRTs, the high-quality large monitors (the ones you'd more likely run at insanely high resolutions) came with big, thick, shielded VGA cables with large ferrite loops embedded. I'm looking at one supplied with an old NEC CRT monitor, and it's at least twice as thick from the shielding and much better made than the thin junk that comes with current Dell LCD monitors, even though the length is the same.

If the picture on any TV/monitor sucks with VGA or other analog cabling, it has nothing to do with the "signal" type. It has everything to do with how well the manufacturer supports it with their penny-pinching and kowtowing to the media industry. There is no "physical" reason a VGA analog signal can't look as good as an HDMI digital signal at 1080P on your TV/monitor; however, there are lots of cost-driven design reasons why your big 1080P TV/monitor looks worse with VGA analog.
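For some rough perspective on the bandwidth involved, here's a back-of-the-envelope sketch in Python; the ~30%/5% blanking overheads are approximations, not exact VESA/CEA timings:

```python
# Rough pixel-clock comparison: 1080p over VGA vs. the kind of CRT modes
# good RAMDACs were already driving years ago. The blanking overheads here
# (~30% horizontal, ~5% vertical) are approximations, not exact timings.

def approx_pixel_clock_mhz(h_active, v_active, refresh_hz):
    return h_active * 1.30 * v_active * 1.05 * refresh_hz / 1e6

print(f"1920x1080 @ 60 Hz: ~{approx_pixel_clock_mhz(1920, 1080, 60):.0f} MHz")
print(f"2048x1536 @ 85 Hz: ~{approx_pixel_clock_mhz(2048, 1536, 85):.0f} MHz")

# The actual CEA timing for 1080p60 is 148.5 MHz. Late-era consumer video
# cards shipped RAMDACs rated around 400 MHz, so 1080p is nowhere near the
# limit of what a decent analog VGA path can carry cleanly.
```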
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
If the picture on any TV/monitor sucks with VGA or other analog cabling, it has nothing to do with the "signal" type. It has everything to do with how well the manufacturer supports it with their penny-pinching and kowtowing to the media industry. There is no "physical" reason a VGA analog signal can't look as good as an HDMI digital signal at 1080P on your TV/monitor; however, there are lots of cost-driven design reasons why your big 1080P TV/monitor looks worse with VGA analog.


Well, you're partially correct. There is always signal degradation with a digital-to-analog conversion (or vice versa), and sending data to an LCD via analog introduces two conversions (digital to analog at the video card, and then analog back to digital at the display). This will, objectively, result in an image of lower quality than an end-to-end digital connection in *all* cases. You are also correct that proper cables can minimize the interference picked up along the way, but people rarely use proper cabling. Even with perfect transmission, though, you have some degree of signal loss because no DAC is perfect, and using two is even worse.

We all used VGA with crappy cables for years, but almost all of us were dealing with reasonably low resolutions on really, really low-quality monitors, where it didn't make much difference to begin with.
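Here's a toy sketch of that double-conversion path in Python (the 8-bit depths and the noise level are made-up numbers, purely to illustrate that a DAC-cable-ADC round trip can only lose information):

```python
import numpy as np

# Toy model of the VGA path: the card's 8-bit frame buffer goes through a DAC,
# picks up a little noise on the cable, and the display's ADC re-quantizes it.
rng = np.random.default_rng(0)
pixels = rng.integers(0, 256, size=100_000)           # original 8-bit values

analog = pixels / 255.0                                # DAC: digital -> voltage
analog += rng.normal(0, 0.002, size=analog.size)       # cable noise/ringing (assumed level)
recovered = np.clip(np.round(analog * 255), 0, 255)    # display ADC: voltage -> 8-bit

errors = np.count_nonzero(recovered != pixels)
print(f"{errors / pixels.size:.1%} of pixel values changed in transit")
```

Even with that tiny assumed noise level, a noticeable fraction of values land one step off after re-quantization; a pure digital link would deliver every value untouched.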
 

mrpiggy

Member
Apr 19, 2012
196
12
81
Well, you're partially correct. There is always signal degradation with a digital-to-analog conversion (or vice versa), and sending data to an LCD via analog introduces two conversions (digital to analog at the video card, and then analog back to digital at the display). This will, objectively, result in an image of lower quality than an end-to-end digital connection in *all* cases. You are also correct that proper cables can minimize the interference picked up along the way, but people rarely use proper cabling. Even with perfect transmission, though, you have some degree of signal loss because no DAC is perfect, and using two is even worse.

We all used VGA with crappy cables for years, but almost all of us were dealing with reasonably low resolutions on really, really low-quality monitors, where it didn't make much difference to begin with.

Well, I agree that there is always some analog/digital conversion loss, but at 1920x1080, with decent RAMDACs and cabling, there should be no discernible difference in picture quality.
 

kyonu

Member
Dec 1, 2011
55
0
0
Well, I agree that there is always some analog/digital conversion loss, but at 1920x1080, with decent RAMDACs and cabling, there should be no discernible difference in picture quality.

I only use quality cables, and the VGA cable I have is well shielded and pretty damn thick, and I still get a loss on video. The loss looks just like the difference between DVD and Blu-ray movies.

As for the DVI-to-HDMI cable I tried, it's just carrying the same HDMI signal at 1080p, and I get the same input-delay issue I originally had, which is very annoying. I'll just have to use VGA until I want to watch videos or something.
 

fuzzymath10

Senior member
Feb 17, 2010
520
2
81
I think lots of video cards are also to blame; reviewers used to evaluate 2D quality because the only output was analog and the quality of the output varied greatly. Using the same TV and same cheap/thin cable, my desktop's DVI and VGA outputs are essentially indistinguishable, while my laptop's VGA output is noticeably fuzzier with horizontal artifacts compared to its reference DVI output. This is all at native resolution.

Some LCDs decide to sync their VGA inputs at 75Hz; because DACs generally perform better with lower bandwidth, forcing this to 60Hz often helps, unless it's already at that level.
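To put rough numbers on the 75 Hz vs. 60 Hz point (same caveat: assuming generic ~30%/5% blanking overhead rather than exact VESA timings):

```python
# Back-of-the-envelope: how much analog bandwidth does 1080p need at 75 Hz vs 60 Hz?
h_total = 1920 * 1.30      # active width plus rough horizontal blanking
v_total = 1080 * 1.05      # active height plus rough vertical blanking

for hz in (75, 60):
    print(f"{hz} Hz -> ~{h_total * v_total * hz / 1e6:.0f} MHz pixel clock")

# Dropping from 75 Hz to 60 Hz cuts the pixel clock by 20%, giving the
# display's ADC that much more time to settle on each pixel.
```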

I put a brand-new Radeon 6570 in my parents' computer, and it loses sync (blinks off then on) with their analog-only LCD without careful tweaking, while the IGP and another older P4 desktop had no issues at all. A good analog setup is impossible to tell apart from DVI/HDMI, but many analog setups are bad.
 

Shaolinmonkk

Member
Apr 8, 2012
25
0
0
Be aware that many "1080P" HDTVs do not in fact have a native resolution of 1080P and use image scaling, which significantly increases latency.

This is news to me. How can you tell whether a TV that is labeled 1920x1080p is true 1920x1080p or not?


Now I'm wondering if all of my 1920x1080p TVs are actually true 1920x1080p. I sure hope so, after all the money I spent on them.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
This is news to me. How can you tell whether a TV that is labeled 1920x1080p is true 1920x1080p or not?


Now I'm wondering if all of my 1920x1080p TVs are actually true 1920x1080p. I sure hope so, after all the money I spent on them.

Couldn't you right-click on the desktop and see what the resolution is?

I think the statement only referred to a TV that represented itself to the computer (via the VGA connection) as having less than 1920x1080 resolution.

But if the TV tells the computer it's 1920x1080, and the computer sends a VGA signal that is 1920x1080, I don't see how upscaling fits into that situation, unless there is some kind of filtering or something going on.
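If you don't want to trust the right-click dialog, here's a quick way to confirm what the PC is actually rendering at (a Windows-only sketch; note that GetSystemMetrics may report a DPI-scaled value on newer Windows unless the process is DPI-aware):

```python
import ctypes

# Ask Windows for the primary display's current resolution in pixels.
user32 = ctypes.windll.user32
width = user32.GetSystemMetrics(0)   # SM_CXSCREEN
height = user32.GetSystemMetrics(1)  # SM_CYSCREEN
print(f"Desktop is being rendered at {width}x{height}")

# If this reports 1920x1080 but the picture still looks soft, the scaling or
# filtering is happening inside the TV rather than on the PC side.
```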

I'm going to take some pictures to test this, comparing my DVI monitor to a VGA TV...
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
OK, here are some sample pics I took. Which one is the worst?

[Attached image: sample image quality, VGA vs. DVI]
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
King Fatty, I can take 3 pictures of the same exact screen (without changing inputs) and get images as different as those 3.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Yes, it was very difficult getting a picture of the TV, and it looked much better to my eyeballs than to the camera. I think there was some moire-pattern thing that confused the camera too, as the live preview on the camera looked better than the still photo after I snapped the image.

Either way, I think the TV did a good job of rendering the individual pixels; as you can see on the slanted parts of the K or the Y, it was able to resolve them cleanly. But the pixels are just so big that when you blow up the picture, the white goes rainbow, and you should never go full rainbow.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
I could, but since I'm not talking about a computer or a computer monitor, I obviously can't.

Your TV might have a side input that's convenient enough to plug your laptop into, so you can then check the display properties on the TV. Otherwise, you may have to move furniture around to get behind the TV and plug the laptop in that way. My living-room TV is connected to my laptop like that, using the VGA port on the back of the TV.