Reasons not to use a TV as a computer monitor?

boed

Senior member
Nov 19, 2009
539
14
81
Hello,


I know that back when computers used CRTs there were big reasons not to use a TV as a computer monitor. I'm curious whether there are still reasons not to use an LED TV as a monitor, assuming it has a decent response time.



Thanks
 
Last edited:

Mr Evil

Senior member
Jul 24, 2015
464
187
116
mrevil.asvachin.com
Problems I've had using TVs before include:
  • Overscan which can't be turned off easily, or at all.
  • Input processing which adds multiple frames of lag that can't be turned off.
  • Processing that messes with the saturation or contrast that can't be turned off.
  • Inability to handle 4:4:4 subsampling.
  • Slow start-up, because the TV runs a whole operating system that takes 30s to boot.
  • Some other problem that appears out of the blue because the TV has updated its firmware over the internet without asking.
There are TVs that don't have any of those problems and work fine as monitors, but they don't exactly put this stuff on the spec sheet, so you have to find it out yourself.
 
  • Like
Reactions: boed

kalrith

Diamond Member
Aug 22, 2005
6,628
7
81
I've been using a 32" 1080p TV as a computer monitor for years. I had to do a lot of research to find one that does 1:1 pixel mapping with low input lag, and I had to put it in gaming mode and cut the sharpness to 0 to reduce the lag further. Here are the problems I could not eliminate:
  • No standby mode, so I have to turn on the TV every time instead of the computer waking it up.
  • Poor text rendering; red text in particular looks very bad on the TV (there's a quick sketch of why at the end of this post).
  • Low PPI. I don't notice this while gaming, but for any other use nothing looks crisp or sharp. This wasn't an issue when I sat 4' from the TV, but now I'm about 2 1/2' away, and it's much more noticeable (though much more immersive for gaming).
As Mr. Evil pointed out, there are new problems today, such as the lack of 4:4:4 chroma and smart TV features that weren't around when I was shopping for a monitor.
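If anyone is wondering why red text in particular goes fuzzy, it's most likely the chroma handling Mr. Evil mentioned: the TV keeps brightness (luma) at full resolution but stores the color channels at half resolution unless it's in a true 4:4:4 mode. Here's a rough numpy sketch of the idea, not how any particular TV processes it; the 8-pixel scanline is made up, the halving is horizontal-only (closer to 4:2:2 than real 4:2:0, which also halves vertically), and the coefficients are the standard BT.709 ones.

Code:
import numpy as np

def rgb_to_ycbcr(rgb):
    # Full-range BT.709 conversion
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748
    return np.stack([y, cb, cr], axis=-1)

def ycbcr_to_rgb(ycc):
    y, cb, cr = ycc[..., 0], ycc[..., 1], ycc[..., 2]
    r = y + 1.5748 * cr
    b = y + 1.8556 * cb
    g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
    return np.stack([r, g, b], axis=-1)

# One made-up scanline: a 2-pixel-wide pure red stroke on black,
# like the vertical bar of a letter in red text.
row = np.zeros((8, 3))
row[3:5, 0] = 1.0

ycc = rgb_to_ycbcr(row)
# Subsample the chroma: average each pair of pixels, then stretch back out.
cb = ycc[:, 1].reshape(-1, 2).mean(axis=1).repeat(2)
cr = ycc[:, 2].reshape(-1, 2).mean(axis=1).repeat(2)
back = ycbcr_to_rgb(np.stack([ycc[:, 0], cb, cr], axis=-1))

print(np.round(back, 2))
# The red stroke loses a big chunk of its red and smears color into the
# neighbouring black pixels - that fringing is the fuzziness you see on
# small colored text when the TV won't accept 4:4:4.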
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
For me, input lag is too important to compromise on.

Even the top-tier TVs for input lag tend to have between 1 and 1.5 frames of delay at 60Hz, and they often achieve that without chroma 4:4:4. On some sets the lowest-lag Game mode has no 4:4:4 support, so you have to switch to a laggier PC mode to get it.

It is very easy to find monitors with less than half a frame of input lag.
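If it helps to see those figures in milliseconds, one frame at 60Hz is 1000/60 ≈ 16.7ms, so the conversion is trivial. Quick Python sketch using the same rough frame counts as above (not measurements of any specific set):

Code:
# Convert "frames of delay" at a given refresh rate into milliseconds.
def lag_ms(frames_of_delay, refresh_hz=60):
    return frames_of_delay * 1000.0 / refresh_hz

for frames in (0.5, 1.0, 1.5):
    print(f"{frames} frames @ 60Hz = {lag_ms(frames):.1f} ms")
# 0.5 frames @ 60Hz = 8.3 ms   (easy to find in a monitor)
# 1.0 frames @ 60Hz = 16.7 ms
# 1.5 frames @ 60Hz = 25.0 ms  (roughly the best TVs in game mode)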
 

Hinda65

Senior member
Jun 19, 2010
363
1
81
So I'm confused by some of the responses here. Two responses said there's no support for 4K/60Hz chroma 4:4:4, and one says there is support if you don't use game mode. I was under the impression that with a "high speed" HDMI cable and an HDMI 2.0 TV you can have 4K/60Hz with 4:4:4 as long as the TV has an HDR setting?

I'm probably not understanding what's being said. Can someone clarify?
 

Fire&Blood

Platinum Member
Jan 13, 2009
2,333
18
81
Some 4K TVs can do full 4:4:4 chroma at 4K/60, but it tends to increase input lag, whereas monitors manage full chroma while keeping input lag low. It's not dependent on an HDR setting; you can have 4K/60Hz with true 4:4:4 chroma and meet a bunch of other specs (Adobe RGB coverage and the like) without HDR compatibility.
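On the bandwidth side of Hinda65's question, whether 4K/60 with full 4:4:4 fits through HDMI 2.0 is just arithmetic. Rough Python sketch; the 14.4 Gbit/s figure is HDMI 2.0's usable payload after 8b/10b encoding, and the 10% blanking allowance is my own round number rather than the exact CTA-861 timing:

Code:
HDMI20_DATA_GBPS = 14.4  # ~18 Gbit/s raw minus 8b/10b encoding overhead

def video_gbps(width, height, fps, bits_per_channel, channels=3, blanking=1.10):
    # blanking=1.10 is a rough 10% allowance for blanking intervals (assumption)
    return width * height * fps * bits_per_channel * channels * blanking / 1e9

for depth, label in ((8, "8-bit 4:4:4"), (10, "10-bit 4:4:4 (HDR)")):
    rate = video_gbps(3840, 2160, 60, depth)
    verdict = "fits" if rate <= HDMI20_DATA_GBPS else "does NOT fit"
    print(f"4K/60 {label}: ~{rate:.1f} Gbit/s -> {verdict}")
# 4K/60 8-bit 4:4:4: ~13.1 Gbit/s -> fits
# 4K/60 10-bit 4:4:4 (HDR): ~16.4 Gbit/s -> does NOT fit

So 8-bit 4:4:4 at 4K/60 squeezes through a "high speed" cable on HDMI 2.0, but once you add 10-bit HDR on top something has to give, which is why HDR modes often drop to 4:2:2 or 4:2:0.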

As for HDR, I'm tempted to go see the first JJ Abrams HDR movie just so I can sue him for blinding me with lens flare. So many shows will go overboard with the effect, and wait until advertisers get hold of it and burn product images into your retina; it will be like the commercial volume boost all over again.

HDR is nice but I wouldn't prioritize it until OLED is widespread and does 1000+ nits.
 
  • Like
Reactions: nathanddrews

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
www.youtube.com
Another thing to consider - what is your target resolution? Depending upon your gaming rig, higher resolution can mean longer frame times, which can make all this talk about input lag moot. If you can't smoothly game at 4K60 now, then I wouldn't worry about whether the display is 4ms or 40ms. 1080p60 is pretty easy for a lot of graphics cards to hit with newer games. 1080p120 is harder. 4K60 is a lot harder.
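To put rough numbers on that, what you feel is approximately the render frame time plus the display's processing lag. Quick Python sketch; the frame rates are made-up examples for a hypothetical rig, and the 4ms/40ms figures are just the two extremes mentioned above:

Code:
# Rough total latency: GPU frame time + display processing lag.
def total_latency_ms(fps, display_lag_ms):
    return 1000.0 / fps + display_lag_ms

for label, fps in (("1080p @ 90 fps", 90), ("4K @ 35 fps", 35)):
    for display, lag in (("4 ms monitor", 4), ("40 ms TV", 40)):
        print(f"{label} + {display}: ~{total_latency_ms(fps, lag):.0f} ms")
# 1080p @ 90 fps + 4 ms monitor: ~15 ms
# 1080p @ 90 fps + 40 ms TV: ~51 ms
# 4K @ 35 fps + 4 ms monitor: ~33 ms
# 4K @ 35 fps + 40 ms TV: ~69 ms
# At 35 fps the render time alone is ~29 ms, already several times the good
# monitor's lag - sort out frame rate first, then worry about the display.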

I'm going to sound like a broken record (do people even know what that phrase means anymore?), but a great place to start your research is rtings. They have a fairly exhaustive comparison of many HDTVs and how they perform specifically as PC monitors, including input lag measurements at multiple resolutions and operational presets (Game Mode, etc.). Every individual review also has a Q&A section where they answer specific questions from readers, which really helps clear up things the collected data doesn't show.

Input lag tests:
http://www.rtings.com/tv/tests/inputs/input-lag

PC monitor ratings:
http://www.rtings.com/tv/reviews/by-usage/pc-monitor/best

Most of the latency is due to the internal SoCs they use for adapting input signals to the panel's native resolution and refresh rate. Saving a few pennies per set adds up to millions of dollars over time, and with the focus being primarily movies and television shows, input lag is not a top priority. IMO, 4K TVs need another generation of hardware updates before they make good 4K PC gaming monitors. But then, GPUs need another generation of hardware updates before 4K60 gaming is a reality for many people.