Using an LCD TV as a Computer Monitor/Television

SynthDude2001

Mar 19, 2003
Originally posted by: xtknight
Not sure how overscan would be introduced into a DVI connection. The GPU's output is digital (not like it goes through DAC->ADC) and goes straight to the DVI port and straight to your LCD TV's digital input and is displayed as a perfect matrix of pixels. I say definitely connect it via DVI.

To be honest I don't know exactly how it works, but when I've connected my video card to an LCD rear-projection HDTV before (over DVI), there was indeed some overscan. Luckily the Nvidia driver is able to compensate for it, but the actual "displayed" resolution at 720p was something like 11xx*6xx, IIRC. Like I said, I don't know how overscan works or even why it still exists in modern TVs, but some TVs definitely still have an issue with it.
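For what it's worth, that "11xx*6xx" figure is consistent with the driver trimming roughly 5% from each edge of the 1280x720 frame. A quick sketch of the arithmetic (the 5%-per-edge figure is my assumption for illustration, not a value anyone measured here):

```python
# Sketch: how driver-side overscan compensation shrinks the usable desktop.
# Assumes the same percentage is trimmed from every edge of the frame.
def compensated_resolution(width, height, overscan_pct_per_edge):
    """Underscanned resolution after trimming overscan_pct_per_edge
    from each of the four edges (so 2x that percentage per axis)."""
    scale = 1 - 2 * overscan_pct_per_edge / 100
    # Snap to even pixel counts, as drivers typically do.
    return (int(width * scale) // 2 * 2, int(height * scale) // 2 * 2)

print(compensated_resolution(1280, 720, 5))  # -> (1152, 648), i.e. "11xx*6xx"
```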
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: xtknight
Not sure how overscan would be introduced into a DVI connection. The GPU's output is digital (not like it goes through another DAC->ADC) and goes straight to the DVI port and straight to your LCD TV's digital input and is displayed as a perfect matrix of pixels. I say definitely connect it via DVI.

I am not 100% qualified to answer this, but I believe most do not have straight-up DVI connections; they use HDMI instead. HDMI is essentially DVI with audio, and while I am not sure of the differences, it appears there are some.

In the AVS Forums, the Westinghouse that people were raving about (I never tested it, but I am sure it is a decent model if they say it is) had only a VGA port for PC use, which did have overscan... Why? I don't know; I just know they have overscan.
 

imported_ST

Senior member
Oct 10, 2004
AA - You can perceive it how you like, or you can keep trying to defend/make excuses for yourself.

I informed the OP of what his original choice would encompass: a modified BenQ monitor and the nuances of that set, mainly the deceptive 1920x1080 resolution in the ad.

Since most future monitors will likely support 1080p due to the coming advent of HD-DVD and Blu-ray, since his usage is mainly as a computer, and given my own experience as an avid HTPC user, my suggestion was those two choices, for future-proofing's sake.

You, instead, try to puff your chest as some "consultant", spreading misinformation and assumptions that can grossly skew reasonable logic, which is why my rebuttal was quantifiable numbers and first-hand experience.

Anywho, I am really tired of your rambling and of the tangent this thread has gone off on.

To the OP - sorry about the side BS. Anyhow, these are typically the criteria you should look for in a good LCD TV:

- Size (the larger the better, with one caveat, see #2)
- Resolution (imperative when used as an HTPC / gaming system)
- Response time (for minimal ghosting)
- Inputs (VGA, DVI, HDCP / HDMI, etc.)
- Contrast ratio (for deep blacks relative to the panel's peak brightness)
- ATSC / input configurations (720p, 1080i, 1080p, etc.)

Good luck!
 

xtknight

Elite Member
Oct 15, 2004
Originally posted by: SynthDude2001
Originally posted by: xtknight
Not sure how overscan would be introduced into a DVI connection. The GPU's output is digital (not like it goes through DAC->ADC) and goes straight to the DVI port and straight to your LCD TV's digital input and is displayed as a perfect matrix of pixels. I say definitely connect it via DVI.

To be honest I don't know exactly how it works, but when I've connected my video card to an LCD rear-projection HDTV before (over DVI), there was indeed some overscan. Luckily the Nvidia driver is able to compensate for it, but the actual "displayed" resolution at 720p was something like 11xx*6xx, IIRC. Like I said, I don't know how overscan works or even why it still exists in modern TVs, but some TVs definitely still have an issue with it.

Rear projection LCD? Huh? :p

The TV probably assumed it was connected via analog and tried to adjust it. Stoopahd.

Originally posted by: ArchAngel777
In the AVS Forums, the Westinghouse that people were raving about (I never tested it, but I am sure it is a decent model if they say it is) had only a VGA port for PC use, which did have overscan... Why? I don't know; I just know they have overscan.

Are you saying it happens with VGA, with DVI/HDMI, or both? It makes perfect sense for it to happen with VGA (analog), but I believe there are no overscan problems with DVI/HDMI, unless you're saying people reported it with DVI/HDMI in addition to VGA.
 

imported_ST

Senior member
Oct 10, 2004
Originally posted by: SynthDude2001
Originally posted by: ST
2 - Again, you may want to look at my sig for my system specs, as it includes a 7800GTX also. Time and time again, benchmarks and reviews have shown that there are negligible differences between the 6800U/7800GT/7800GTX until you start running at higher resolutions. As a forum that stresses value, I don't think it's in the best interest of people to waste additional $$ that is not well spent, but hey, it's your life... have fun.

Could not the same thing be said for spending a lot of money on a 1080p display when a particular user doesn't need or want one...?

His first choice was seemingly a 1080p display, FYI.

 

imported_ST

Senior member
Oct 10, 2004
Originally posted by: SynthDude2001
Originally posted by: xtknight
Not sure how overscan would be introduced into a DVI connection. The GPU's output is digital (not like it goes through DAC->ADC) and goes straight to the DVI port and straight to your LCD TV's digital input and is displayed as a perfect matrix of pixels. I say definitely connect it via DVI.

To be honest I don't know exactly how it works, but when I've connected my video card to an LCD rear-projection HDTV before (over DVI), there was indeed some overscan. Luckily the Nvidia driver is able to compensate for it, but the actual "displayed" resolution at 720p was something like 11xx*6xx, IIRC. Like I said, I don't know how overscan works or even why it still exists in modern TVs, but some TVs definitely still have an issue with it.

Most consumer TVs don't have a direct DVI-to-panel connection; the signal instead goes through internal deinterlacing/scaling circuitry and then to the panel itself. You can see this first hand with the Sharp 45" LCD TV. Although it has a native 1920x1080 display panel, you cannot get better than 1376x768 pixel mapping from any source (i.e., non-1080p) unless you bypass the external AVS box and connect directly via DVI. The other variation of this model does not have an external box, so you cannot "bypass" this limitation.
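To put numbers on that Sharp example: the resolutions are from the post above, but the pipeline model here is my own simplification of the double-scaling being described.

```python
# Simplified model of the processing chain described above: every input is
# resampled to a fixed intermediate resolution before reaching the panel,
# so even a source that matches the panel is never mapped 1:1.
def effective_detail(source, intermediate, panel):
    """Best-case distinct detail surviving source -> intermediate -> panel,
    given as (width, height) tuples."""
    return (min(source[0], intermediate[0], panel[0]),
            min(source[1], intermediate[1], panel[1]))

# A 1920x1080 PC signal forced through the TV's 1376x768 scaler stage:
print(effective_detail((1920, 1080), (1376, 768), (1920, 1080)))
# -> (1376, 768): the panel lights 1080 rows, but carries only 768 rows of detail.
```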

 
SynthDude2001

Mar 19, 2003
Originally posted by: ST
Originally posted by: SynthDude2001
Originally posted by: ST
2 - Again, you may want to look at my sig for my system specs, as it includes a 7800GTX also. Time and time again, benchmarks and reviews have shown that there are negligible differences between the 6800U/7800GT/7800GTX until you start running at higher resolutions. As a forum that stresses value, I don't think it's in the best interest of people to waste additional $$ that is not well spent, but hey, it's your life... have fun.

Could not the same thing be said for spending a lot of money on a 1080p display when a particular user doesn't need or want one...?

His first choice was seemingly a 1080p display, FYI.

I realize that. ;)

I'm just saying, I don't think it's necessarily consistent to knock ArchAngel's "extravagant" choice of a 7800GTX at the same time you're trying to convince him that he's "missing out on so much" by not having a 1080p display. Sure, objectively the 1080p display is better - I don't think anyone is disputing that. But with his reasoning that he wants to keep the card for a long time, the combination (his card and display) seems logical to me...
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: xtknight
Originally posted by: SynthDude2001
Originally posted by: xtknight
Not sure how overscan would be introduced into a DVI connection. The GPU's output is digital (not like it goes through DAC->ADC) and goes straight to the DVI port and straight to your LCD TV's digital input and is displayed as a perfect matrix of pixels. I say definitely connect it via DVI.

To be honest I don't know exactly how it works, but when I've connected my video card to an LCD rear-projection HDTV before (over DVI), there was indeed some overscan. Luckily the Nvidia driver is able to compensate for it, but the actual "displayed" resolution at 720p was something like 11xx*6xx, IIRC. Like I said, I don't know how overscan works or even why it still exists in modern TVs, but some TVs definitely still have an issue with it.

Rear projection LCD? Huh? :p

The TV probably assumed it was connected via analog and tried to adjust it. Stoopahd.

Originally posted by: ArchAngel777
In the AVS Forums, the Westinghouse that people were raving about (I never tested it, but I am sure it is a decent model if they say it is) had only a VGA port for PC use, which did have overscan... Why? I don't know; I just know they have overscan.

Are you saying it happens with VGA, with DVI/HDMI, or both? It makes perfect sense for it to happen with VGA (analog), but I believe there are no overscan problems with DVI/HDMI, unless you're saying people reported it with DVI/HDMI in addition to VGA.

Sorry, I wasn't clear with that... trying to do three things at once. I meant that nearly all consumer TVs do not have a DVI port. Most have a VGA port and sometimes an HDMI port.

DVI does not introduce overscan. HDMI does, as does VGA. I am not sure WHY HDMI overscans for users, since it is essentially supposed to be DVI... that is why I am confused. As for VGA? I know why that overscans.

 

imported_ST

Senior member
Oct 10, 2004
Originally posted by: ArchAngel777
Originally posted by: xtknight
Not sure how overscan would be introduced into a DVI connection. The GPU's output is digital (not like it goes through another DAC->ADC) and goes straight to the DVI port and straight to your LCD TV's digital input and is displayed as a perfect matrix of pixels. I say definitely connect it via DVI.

I am not 100% qualified to answer this, but I believe most do not have straight-up DVI connections; they use HDMI instead. HDMI is essentially DVI with audio, and while I am not sure of the differences, it appears there are some.

In the AVS Forums, the Westinghouse that people were raving about (I never tested it, but I am sure it is a decent model if they say it is) had only a VGA port for PC use, which did have overscan... Why? I don't know; I just know they have overscan.


Please stop spreading MISINFORMATION!

The Westinghouse 37", which I bought and also posted about on the AVS forums, ALSO has a preferred DVI input for PC use and works great with minimal overscan (4%, quantified).

 

xtknight

Elite Member
Oct 15, 2004
Oh I see. HDMI and overscan? Yeah that's odd.

ST: Why should a deinterlacer and scaler introduce overscan? Is the video data not digital all the way through that process?
 
SynthDude2001

Mar 19, 2003
Originally posted by: ST
Originally posted by: SynthDude2001
Originally posted by: xtknight
Not sure how overscan would be introduced into a DVI connection. The GPU's output is digital (not like it goes through DAC->ADC) and goes straight to the DVI port and straight to your LCD TV's digital input and is displayed as a perfect matrix of pixels. I say definitely connect it via DVI.

To be honest I don't know exactly how it works, but when I've connected my video card to an LCD rear-projection HDTV before (over DVI), there was indeed some overscan. Luckily the Nvidia driver is able to compensate for it, but the actual "displayed" resolution at 720p was something like 11xx*6xx, IIRC. Like I said, I don't know how overscan works or even why it still exists in modern TVs, but some TVs definitely still have an issue with it.

Most consumer TVs don't have a direct DVI-to-panel connection; the signal instead goes through internal deinterlacing/scaling circuitry and then to the panel itself. You can see this first hand with the Sharp 45" LCD TV. Although it has a native 1920x1080 display panel, you cannot get better than 1376x768 pixel mapping from any source (i.e., non-1080p) unless you bypass the external AVS box and connect directly via DVI. The other variation of this model does not have an external box, so you cannot "bypass" this limitation.

That makes sense and seems to agree with what I've read before. In any case, it wasn't my HDTV (the one I used at the time), so I haven't bothered to read up on it a whole lot. :p

Was just kind of surprised that there was an issue with overscan with DVI on a supposedly digital display.


Originally posted by: xtknight
Originally posted by: SynthDude2001
Originally posted by: xtknight
Not sure how overscan would be introduced into a DVI connection. The GPU's output is digital (not like it goes through DAC->ADC) and goes straight to the DVI port and straight to your LCD TV's digital input and is displayed as a perfect matrix of pixels. I say definitely connect it via DVI.

To be honest I don't know exactly how it works, but when I've connected my video card to an LCD rear-projection HDTV before (over DVI), there was indeed some overscan. Luckily the Nvidia driver is able to compensate for it, but the actual "displayed" resolution at 720p was something like 11xx*6xx, IIRC. Like I said, I don't know how overscan works or even why it still exists in modern TVs, but some TVs definitely still have an issue with it.

Rear projection LCD? Huh? :p

The TV probably assumed it was connected via analog and tried to adjust it. Stoopahd.

Am I missing something? :confused: Like I said, it wasn't my TV, but I thought it was an LCD rear projection... it definitely wasn't a direct-view LCD panel...
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Don't "direct-view" and "rear-projection" only refer to CRTs or have I had a bit too much virtual crack today?
 

imported_ST

Senior member
Oct 10, 2004
Originally posted by: SynthDude2001

His first choice was seemingly a 1080p display, FYI.

I realize that. ;)

I'm just saying, I don't think it's necessarily consistent to knock ArchAngel's "extravagant" choice of a 7800GTX at the same time you're trying to convince him that he's "missing out on so much" by not having a 1080p display. Sure, objectively the 1080p display is better - I don't think anyone is disputing that. But with his reasoning that he wants to keep the card for a long time, the combination (his card and display) seems logical to me...

Well, two things: 1) If the original user's intention was that particular LCD TV set, he must have been drawn by the high native 1920x1080 resolution (I know I would be); otherwise there are many other, cheaper alternatives. 2) AA insinuates that he doesn't need to spend to upgrade every year, yet he has the latest 7800GTX on a low-res display, which is perplexing. At those resolutions, a more value-oriented solution (X800GTO2 / 6600GT) would suffice.
 
SynthDude2001

Mar 19, 2003
Originally posted by: xtknight
Don't "direct-view" and "rear-projection" only refer to CRTs or have I had a bit too much virtual crack today?

I may be misusing terms (and I apologize if that's the case), but I'm 99% certain that there exist LCD rear-projection TVs too (and that the one I used was one of them)... By "direct view" I just meant a regular LCD panel, although now that you mention it, I think I have only really heard that particular term used to describe CRTs.

Edit: I believe the TV was this one or one very similar to it.
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: ST
Originally posted by: ArchAngel777
Originally posted by: xtknight
Not sure how overscan would be introduced into a DVI connection. The GPU's output is digital (not like it goes through another DAC->ADC) and goes straight to the DVI port and straight to your LCD TV's digital input and is displayed as a perfect matrix of pixels. I say definitely connect it via DVI.

I am not 100% qualified to answer this, but I believe most do not have straight-up DVI connections; they use HDMI instead. HDMI is essentially DVI with audio, and while I am not sure of the differences, it appears there are some.

In the AVS Forums, the Westinghouse that people were raving about (I never tested it, but I am sure it is a decent model if they say it is) had only a VGA port for PC use, which did have overscan... Why? I don't know; I just know they have overscan.


Please stop spreading MISINFORMATION!

The Westinghouse 37", which I bought and also posted about on the AVS forums, ALSO has a preferred DVI input for PC use and works great with minimal overscan (4%, quantified).

If you scroll up, I had said that most consumer-level TVs have between 5 and 15 percent overscan. I guess in this case I was 1% off. Wow, you sure showed me...

Anyway, aside from that, overscan is overscan... I am not sure how I am spreading misinformation when you admit that the DVI port on the Westinghouse has around 4% overscan.

But anyway, keep trying to attack my statements.

 

imported_ST

Senior member
Oct 10, 2004
Originally posted by: xtknight
Oh I see. HDMI and overscan? Yeah that's odd.

ST: Why should a deinterlacer and scaler introduce overscan? Is the video data not digital all the way through that process?

It shouldn't... but again, depending on the implementation, there may be some A/D and D/A conversions that we're not aware of.

 

xtknight

Elite Member
Oct 15, 2004
I doubt the DVI input goes through deinterlacing... rather, only the output from the 8VSB/8PSK/QAM/whatever demodulator does.

My external HDTV tuner takes in 1080i and outputs DVI, and it's already scaled and deinterlaced to fit my 1280x1024 desktop LCD just fine. I'm sure my desktop LCD has no deinterlacing capability in it (well, pretty sure - it's not marketed as an "LCD TV/multimedia LCD"). I have no overscan/side margin issues at all with it. Zilch.
 

imported_ST

Senior member
Oct 10, 2004
Originally posted by: ArchAngel777
If you scroll up, I had said that most consumer-level TVs have between 5 and 15 percent overscan. I guess in this case I was 1% off. Wow, you sure showed me...

Anyway, aside from that, overscan is overscan... I am not sure how I am spreading misinformation when you admit that the DVI port on the Westinghouse has around 4% overscan.

But anyway, keep trying to attack my statements.

"the Westinghouse that people were raving about (I never tested it, I am sure it is decent model, if the say it is) had only a VGA port for PC use"

:rolleyes:
 
SynthDude2001

Mar 19, 2003
Originally posted by: xtknight
I doubt DVI goes through deinterlacing...rather only the output from the 8VSB/8PSK/QAM/whatever demodulator.

My external HDTV tuner inputs 1080i and outputs DVI, and it's already scaled and deinterlaced to fit my 1280x1024 desktop LCD just fine. I'm sure my desktop LCD has no deinterlacing capability in it. I have no overscan/side margin issues at all with it. Zilch.

You know... we're getting way off track here, but I've always wondered about something. My HDTV tuner (PCI) can also output DVI (separately from the video card), at a maximum resolution of 1080i. I can connect this to my 2005FPW just fine, and the signal is displayed by my monitor OSD as "1920x1080 30Hz" (usually - sometimes it displays weird numbers for some reason). In any case, in Full scaling mode, the signal always looks right... and sometimes looks noticeably better than 720p. If my tuner isn't deinterlacing the signal (since I don't think it's outputting a 1080p signal), then is my LCD? :confused:
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: ST
Originally posted by: SynthDude2001

His first choice was seemingly a 1080p display, FYI.

I realize that. ;)

I'm just saying, I don't think it's necessarily consistent to knock ArchAngel's "extravagant" choice of a 7800GTX at the same time you're trying to convince him that he's "missing out on so much" by not having a 1080p display. Sure, objectively the 1080p display is better - I don't think anyone is disputing that. But with his reasoning that he wants to keep the card for a long time, the combination (his card and display) seems logical to me...

Well, two things: 1) If the original user's intention was that particular LCD TV set, he must have been drawn by the high native 1920x1080 resolution (I know I would be); otherwise there are many other, cheaper alternatives. 2) AA insinuates that he doesn't need to spend to upgrade every year, yet he has the latest 7800GTX on a low-res display, which is perplexing. At those resolutions, a more value-oriented solution (X800GTO2 / 6600GT) would suffice.

You still don't get the point. But let me say this, a 6600 GT won't last three years for games PERIOD. You are talking about the NOW, I am talking about the FUTURE... I shouldn't even have to explain this, it should be common sense.
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: ST
Originally posted by: ArchAngel777
If you scroll up, I had said that most consumer-level TVs have between 5 and 15 percent overscan. I guess in this case I was 1% off. Wow, you sure showed me...

Anyway, aside from that, overscan is overscan... I am not sure how I am spreading misinformation when you admit that the DVI port on the Westinghouse has around 4% overscan.

But anyway, keep trying to attack my statements.

"the Westinghouse that people were raving about (I never tested it, I am sure it is decent model, if the say it is) had only a VGA port for PC use"

:rolleyes:

Wow, you really are out for blood, aren't you? Do you think it is possible that Westinghouse, in some attempt to keep up with technology and sales, introduced new models with new features?!?! Say it isn't so!

 

imported_ST

Senior member
Oct 10, 2004
Originally posted by: xtknight
I doubt DVI goes through deinterlacing...rather only the output from the 8VSB/8PSK/QAM/whatever demodulator.

My external HDTV tuner inputs 1080i and outputs DVI, and it's already scaled and deinterlaced to fit my 1280x1024 desktop LCD just fine. I'm sure my desktop LCD has no deinterlacing capability in it (well pretty sure? it's not marketed as "LCD TV"). I have no overscan/side margin issues at all with it. Zilch.

Actually, most TVs/monitors do go through deinterlacing, because the source is unpredictable (480i, 480p, 720p, 1080i). In the case of your HDTV tuner box, you're probably correct; it is most likely deinterlaced internally and output at 720p. It will still have scaling functionality to map other resolutions to your 1280x1024 panel resolution, though.
 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: ArchAngel777
You still don't get the point. But let me say this, a 6600 GT won't last three years for games PERIOD. You are talking about the NOW, I am talking about the FUTURE... I shouldn't even have to explain this, it should be common sense.

lol, yet you make a suggestion for a 1280x768 display

go go bluray/hdtv....not

/golfclap

 

xtknight

Elite Member
Oct 15, 2004
Originally posted by: SynthDude2001
Originally posted by: xtknight
I doubt DVI goes through deinterlacing...rather only the output from the 8VSB/8PSK/QAM/whatever demodulator.

My external HDTV tuner inputs 1080i and outputs DVI, and it's already scaled and deinterlaced to fit my 1280x1024 desktop LCD just fine. I'm sure my desktop LCD has no deinterlacing capability in it. I have no overscan/side margin issues at all with it. Zilch.

You know... we're getting way off track here, but I've always wondered about something. My HDTV tuner (PCI) can also output DVI (separately from the video card), at a maximum resolution of 1080i. I can connect this to my 2005FPW just fine, and the signal is displayed by my monitor OSD as "1920x1080 30Hz" (usually - sometimes it displays weird numbers for some reason). In any case, in Full scaling mode, the signal always looks right... and sometimes looks noticeably better than 720p. If my tuner isn't deinterlacing the signal (since I don't think it's outputting a 1080p signal), then is my LCD? :confused:

My LCD also shows 1080i (with no mention of refresh rate) in the OSD. Confusing.
 

imported_ST

Senior member
Oct 10, 2004
Originally posted by: ArchAngel777
Wow, you really are out for blood, aren't you? Do you think it is possible that Westinghouse, in some attempt to keep up with technology and sales, introduced new models with new features?!?! Say it isn't so!

have you taken your medicine today?

THE ONLY Westinghouse that the AVS forums rave about is the 37" 1920x1080p monitor, as seen at http://www.avsforum.com/avs-vb/showthread.php?t=531808&page=53&pp=30 in all its glorious 53 pages. Or would you care to enlighten us on this "other" one, with only a VGA port, that you are insinuating exists?