Why are regular TVs so vibrant?

yhelothar

Lifer
Dec 11, 2002
18,409
39
91
On my crap fishbowl $300 32" Panasonic el cheapo 640x480 interlaced low-definition TV, the vibrance makes videos look much better than they do on my $700 21" IBM super-high-definition 2048x1536 high-contrast Trinitron tube.

Everything just looks washed out on the Trinitron, while on the TV everything is so much more vibrant.

Anyone else feel the same?

Why is that kind of vibrance exclusive to TVs? It makes videos look so much more real than the washed-out crap on monitors.

Now I just need one of those 42" plasma screens so I can enjoy the best of both worlds: super high definition and super high vibrance. :thumbsup:

/end rant
 

Ages120

Senior member
May 28, 2004
218
0
0
Nvidia cards have an option in the driver for digital vibrance. It adds tons to the colors. I enjoyed my GF3 more than my 9500 Pro because of that one feature.
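I don't know nvidia's actual math, but here's my guess at the idea as a rough Python sketch; the "boost muted colors harder than already-vivid ones" part is my assumption, not anything from the driver docs:

    # Rough sketch of a "digital vibrance" style boost: push each channel
    # away from the pixel's gray value, with a stronger push for muted
    # colors than for colors that are already vivid.
    def vibrance(r, g, b, amount=0.5):
        y = 0.299 * r + 0.587 * g + 0.114 * b      # Rec. 601 luma = "gray" level
        sat = max(abs(r - y), abs(g - y), abs(b - y)) / 255.0  # 0 = gray, 1 = vivid
        gain = 1.0 + amount * (1.0 - sat)          # muted pixels get the full boost
        clamp = lambda v: max(0, min(255, round(v)))
        return tuple(clamp(y + gain * (c - y)) for c in (r, g, b))

    print(vibrance(140, 120, 110))  # a muted skin tone gets noticeably richer
    print(vibrance(250, 30, 20))    # an already-vivid red gets a smaller boost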
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Have you tried altering brightness, contrast, and saturation? A Trinitron should be very vibrant.
 

DrMindbender

Member
May 26, 2004
143
0
0
Yeah, I noticed that DVDs looked washed out. But I tinkered with my copy of WinDVD, found the color options, and kicked up something like color level or saturation until everyone looked like they were blushing, then leveled it off until everything looked natural. It really makes movies shine, almost better than TV or just like the movie theater. A side note: if you experience dark picture quality, especially on movies like The Matrix, up the gamma and avoid messing with contrast and brightness.
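If you're wondering why gamma and not brightness for dark scenes: brightness shifts every level up equally, so black turns gray, while gamma lifts the midtones and leaves black and white pinned where they are. Quick Python sketch of the difference on 8-bit values:

    # Brightness adds a constant offset (black turns gray); gamma lifts
    # midtones while keeping 0 at 0 and 255 at 255.
    def brightness(v, offset=40):
        return min(255, v + offset)

    def gamma(v, g=1.6):
        return round(255 * (v / 255) ** (1 / g))

    for v in (0, 32, 128, 255):   # black, deep shadow, midtone, white
        print(v, '-> brightness:', brightness(v), ' gamma:', gamma(v))
    # black: brightness turns 0 into 40 (gray haze), gamma keeps it at 0
    # midtone: both lift it, but gamma does so without washing out the ends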
 

yhelothar

Lifer
Dec 11, 2002
18,409
39
91
Can't alter the settings for DVI files :(
Digital vibrance is for saturation, and that's not really the problem; it's more the intensity, the gamma/contrast of the video.
 

DrMindbender

Member
May 26, 2004
143
0
0
I'm not sure exactly how digital vibrance works, but it seems to just make colors brighter, almost fluorescent. Saturation increases the intensity of the reds, greens, and blues in the picture.
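Concretely, what saturation means: each channel gets pushed away from the pixel's gray level, so pure gray stays gray and everything else gets more intense. Rough Python sketch, using the standard Rec. 601 luma weights for SD video:

    # Saturation scales each channel's distance from the pixel's gray
    # value (its luma), so gray stays gray and colors spread apart.
    def saturate(r, g, b, s=1.5):
        y = 0.299 * r + 0.587 * g + 0.114 * b   # luma = perceived gray level
        clamp = lambda v: max(0, min(255, round(v)))
        return tuple(clamp(y + s * (c - y)) for c in (r, g, b))

    print(saturate(128, 128, 128))  # gray in, gray out: (128, 128, 128)
    print(saturate(160, 120, 100))  # a dull orange gets visibly punchier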
 

yhelothar

Lifer
Dec 11, 2002
18,409
39
91
I just increased the contrast to 115% for the overlay... looks a lot better...
Now if only it would display an image as crisp as the TV...
I tried switching my monitor to 640x480 interlaced at a 43Hz refresh, and it's still not as crisp as a TV, in fact worse than it was when I had it set at 2048x1536 @ 75Hz.
Maybe it's because I sit so close to the monitor.
 

Nebor

Lifer
Jun 24, 2003
29,582
12
76
It's because your TV runs everything through all sorts of "improvement" processes to add colour and whatever else... Go to an electronics retailer's webpage and look at all the technocrap they list for a regular ol' CRT tube television. That's why it looks better.

Your monitor is displaying a more "pure" version, which shows the huge imperfections of SD video. Of course, you can tweak that, but you have to do it yourself, since there are no fancy filters built into your monitor to do it for you.
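For a taste of what one of those processes does, here's a toy version of edge enhancement (what the spec sheets call sharpness or aperture correction) in Python. Real sets do this in analog circuitry; this is just the idea:

    # Toy TV-style "picture improvement": boost each pixel by how much
    # it differs from its neighbors, which hardens soft edges.
    def sharpen_line(pixels, amount=0.5):
        out = []
        for i, p in enumerate(pixels):
            left = pixels[max(i - 1, 0)]
            right = pixels[min(i + 1, len(pixels) - 1)]
            detail = p - (left + right) / 2        # local edge strength
            out.append(max(0, min(255, round(p + amount * detail))))
        return out

    # a soft ramp (blurry SD video) picks up overshoot at each end of
    # the edge, which the eye reads as extra crispness
    print(sharpen_line([50, 50, 90, 130, 170, 210, 210]))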
 

CU

Platinum Member
Aug 14, 2000
2,415
51
91
My Philips monitor came with a program that would do something to any window you told it to. It was supposed to be used on movie players and image viewers to make them look like they would if they were viewed on a TV. I don't remember the name of the program.
 

yhelothar

Lifer
Dec 11, 2002
18,409
39
91
Originally posted by: Nebor
It's because your TV runs everything through all sorts of "improvement" processes to add colour and whatever else... Go to an electronics retailer's webpage and look at all the technocrap they list for a regular ol' CRT tube television. That's why it looks better.

Your monitor is displaying a more "pure" version, which shows the huge imperfections of SD video. Of course, you can tweak that, but you have to do it yourself, since there are no fancy filters built into your monitor to do it for you.

I think the latest graphics cards do include many filters and decoders and whatnot. During my transition from my 9800 Pro to my FX5900, I had to use an old PCI Matrox Millennium II that had none of these filters and decoders, and everything looked 100x worse, all pixelated.
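The "pixelated" part is probably the scaling: without a filtered video overlay, upscaling is nearest-neighbor, while newer cards interpolate between source pixels. Toy sketch of the difference on one scanline (Python, made-up pixel values):

    # Nearest-neighbor scaling repeats source pixels (blocky); linear
    # interpolation blends between them (smooth), which is roughly what
    # a hardware-filtered overlay buys you.
    src = [10, 200, 60, 140]           # one scanline of source pixels

    def nearest(src, out_len):
        return [src[i * len(src) // out_len] for i in range(out_len)]

    def linear(src, out_len):
        out = []
        for i in range(out_len):
            pos = i * (len(src) - 1) / (out_len - 1)  # map output to source
            lo = int(pos)
            hi = min(lo + 1, len(src) - 1)
            frac = pos - lo
            out.append(round(src[lo] * (1 - frac) + src[hi] * frac))
        return out

    print(nearest(src, 8))   # blocky: each source pixel just repeated
    print(linear(src, 8))    # smooth ramps between source pixels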
 

monzie

Senior member
Oct 28, 2003
247
0
0
Simple... TVs run a different colour space to that of PCs.

PCs are designed for accurate (ish) WYSIWYG, but they are poor at true white; it would hurt your eyes being so close to the screen. TVs 'radiate' white much more, which in effect super-saturates the colours, giving vibrance. Plus the circuitry inside modern TVs lets the set super-define areas of different shades (contrast), making for a sharper and more vibrant picture from a low-res screen. Basically, TVs fool the eye into believing an image is sharper than it really is, which is another reason video can sometimes look totally naff on a PC monitor.

DVDs etc. always look better on a TV because that is what they are designed for (in resolution AND colour space).
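To make the colour space point concrete: DVD video is stored as Rec. 601 YCbCr with black at code 16 and white at 235 ("studio swing"), while a PC expects black at 0 and white at 255. If the player doesn't expand that range, blacks come out gray, which is exactly the washed-out look. Rough Python sketch of the conversion:

    # Rec. 601 "studio swing" YCbCr (16-235 luma, 16-240 chroma) to
    # full-range PC RGB. Skip the expansion and everything looks flat.
    def ycbcr601_to_rgb(y, cb, cr):
        yf = (y - 16) * 255 / 219        # expand luma range
        cbf = (cb - 128) * 255 / 224     # expand and center chroma
        crf = (cr - 128) * 255 / 224
        r = yf + 1.402 * crf
        g = yf - 0.344136 * cbf - 0.714136 * crf
        b = yf + 1.772 * cbf
        clamp = lambda v: max(0, min(255, round(v)))
        return clamp(r), clamp(g), clamp(b)

    print(ycbcr601_to_rgb(16, 128, 128))    # video black -> (0, 0, 0)
    print(ycbcr601_to_rgb(235, 128, 128))   # video white -> (255, 255, 255)
    # without the (y - 16) * 255/219 step, black lands at RGB 16: the
    # gray, low-contrast look this thread is complaining about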

Oh, and I forgot to mention that...

PCs DO NOT display TV resolutions when video is played FULL screen, so pixels are not accurately defined.

PCs do not understand interlacing; it's usually software 'emulating' interlacing on progressive-scan hardware (monitors).

Interlacing (on a TV) is excellent at smoothing out video footage (free anti-aliasing, if you like; anyone who's played a PC game via their TV-out will know what I mean). The simplest software stand-in is sketched below.
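Roughly what the cheapest software deinterlace ("bob") does; players use smarter variants, but this is the basic trick a progressive monitor forces (toy Python):

    # Simplest software deinterlace ("bob"): keep one field (every other
    # scanline) and double its lines to fill the frame.
    def bob_deinterlace(frame, field=0):
        kept = frame[field::2]          # field 0 = even lines, 1 = odd
        out = []
        for line in kept:
            out.append(line)            # the real scanline
            out.append(line)            # doubled to stand in for the missing field
        return out

    frame = ['even0', 'odd0', 'even1', 'odd1', 'even2', 'odd2']
    print(bob_deinterlace(frame))
    # -> ['even0', 'even0', 'even1', 'even1', 'even2', 'even2']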

The human eye is only good at detail CLOSE UP; it drops off rapidly at distances above (approx) 2 feet.

Some monitor manufacturers (namely Philips, Mitsubishi, and Iiyama) have models with a special video mode that enables TV-like circuitry to make video/photos look more vibrant... but as I said above, it's not to be used for normal PC work, unless you like headaches/eye strain/blindness.
 

yhelothar

Lifer
Dec 11, 2002
18,409
39
91
I got it to look like a TV...
I installed older drivers; the 52.16 set gave me more control over video.
I increased the gamma to 1.4 and the contrast to 140%, and now I get the super-white vibrance I see on TVs... still not as sharp even at 640x480, but it looks great.
Oh, and for some reason the nvidia drivers would only let me change the gamma if I use the overlay to clone the video to my second monitor. The original video won't change with my settings.

BTW, DVDs are non-interlaced, and they still don't look too good on monitors, considering they're super clear even on a 60" widescreen HDTV at 720p.
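For what it's worth, here's roughly the curve those two settings trace out; exactly how the driver chains the two operations is my assumption:

    # Rough model of the overlay tweak above: gamma 1.4 lifts the
    # midtones, then 140% contrast steepens the curve around mid-gray.
    def overlay_curve(v, gamma=1.4, contrast=1.4):
        lifted = 255 * (v / 255) ** (1 / gamma)       # gamma lift
        stretched = (lifted - 128) * contrast + 128   # contrast about mid-gray
        return max(0, min(255, round(stretched)))

    for v in (0, 64, 128, 192, 255):
        print(v, '->', overlay_curve(v))
    # inputs above roughly 205 hit 255: the "super white" TV look, at
    # the cost of detail in the brightest highlights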
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
Trinitrons do have a poor color gamut - that's just not what they are about. They are made to be all about contrast and sharpness. TV set tubes are about blurriness and a rather flat color gamut.

Watching TV on Trinitrons always looks awful unless you tweak the gamma and everything a lot. You did that already. And yes, the TV signal, at a bandwidth of about 5 MHz and an effective resolution of about 360x480, is inherently blurry. That only gets more obvious the better the display device is.
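Where a horizontal figure like that comes from: each cycle of video bandwidth carries one light/dark pair across the roughly 52.7 microseconds of active picture per scanline, so horizontal resolution is about 2 x bandwidth x active line time. With ballpark numbers:

    # Ballpark NTSC figures, not exact for any particular set.
    active_line_us = 52.7            # active (visible) part of one scanline, in us
    for bw_mhz in (3.4, 4.2, 5.0):   # filtered composite luma / full luma / studio
        samples = 2 * bw_mhz * active_line_us   # 2 samples per cycle
        print(f'{bw_mhz} MHz -> about {samples:.0f} pixels across')
    # ~3.4 MHz of usable luma (what composite decoding leaves after
    # filtering out the 3.58 MHz color subcarrier) is where ~360 comes from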
 

yhelothar

Lifer
Dec 11, 2002
18,409
39
91
Do you mean 640x480? 360x480 would be a tall box; the aspect ratio wouldn't be right.

As I said, I switched my monitor to 640x480 interlaced, and it still doesn't look as sharp as a TV.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
Originally posted by: yhelothar
BTW, DVDs are non-interlaced, and they still don't look too good on monitors, considering they're super clear even on a 60" widescreen HDTV at 720p.

I think you'll find that the vast majority of DVD content is interlaced.
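Even film titles, which start out as 24 fps progressive frames, come off the disc as 60 interlaced fields per second via 3:2 pulldown. Rough sketch of the cadence:

    # 3:2 pulldown: spread 4 film frames across 10 video fields, so
    # 24 fps film becomes ~60 fields/sec NTSC video.
    film_frames = ['A', 'B', 'C', 'D']   # 4 consecutive film frames
    pattern = [2, 3, 2, 3]               # fields emitted per frame

    fields = []
    for frame, count in zip(film_frames, pattern):
        fields += [frame] * count
    print(fields)   # 10 fields: ['A','A','B','B','B','C','C','D','D','D']
    # 24 frames x 10/4 = 60 fields: the 2-3 cadence a deinterlacer must undo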