1080p Signal Into 4K Video Card Into 4K TV Question

muskyx1

Member
Apr 20, 2005
151
1
81
Seems there's a heated debate at another PC/Electronics Forum and I thought I'd call upon the experts here.

What would be the difference, if any, between these two scenarios?

A Blu-ray Disc played by a Blu-ray player feeding a 1080p signal into a 4K TV

vs.

The same Blu-ray Disc played by a high-end HTPC with a 4K video card feeding a 4K video signal into the same 4K TV.
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
I'd say the HTPC, going by how previous generations of new video tech have played out. The HTPC would need some configuration, though.

I'll piggyback here with a related question. In the OP's scenario, how do you get audio to a receiver? I'd think the DisplayPort output needs to carry the 4K signal. Can the HDMI output carry the audio to the receiver without carrying any video?

I'm considering going to a 4K TV, but I don't want to have to buy a new receiver, and I'd like the video card to handle sound. My current receiver supports HDMI 1.4 and 1080p pass-through. If that's possible, I think I'd have to run DisplayPort from the video card to the TV and HDMI to my receiver.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
The difference is in the scaling algorithm. A TV does nothing beyond a basic bilinear (maybe bicubic) resize. An HTPC can be configured to use all kinds of better algorithms, like Lanczos or Spline36 (see the madVR renderer). There are even more complex algorithms, like SuperResolution, that are too computationally intensive to run in real time for 1080p24 > 4K24, but the point is that the possibilities are endless when you have processing power.
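If you want a feel for how much the resampler choice matters, here's a rough sketch using Pillow as a stand-in for the renderer (madVR itself runs its kernels on the GPU; "frame_1080p.png" is just a placeholder file name):

```python
# Rough comparison of a TV-style resize vs. an HTPC-style resize of a 1080p frame.
# Pillow is only a stand-in here; the file names are placeholders.
from PIL import Image

frame = Image.open("frame_1080p.png")      # 1920x1080 source frame
target = (3840, 2160)                      # UHD panel resolution

# Roughly what a basic TV scaler does:
tv_like = frame.resize(target, Image.Resampling.BILINEAR)

# One of the sharper kernels an HTPC renderer can be told to use:
htpc_like = frame.resize(target, Image.Resampling.LANCZOS)

tv_like.save("upscaled_bilinear.png")
htpc_like.save("upscaled_lanczos.png")
```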
 

muskyx1

Member
Apr 20, 2005
151
1
81
Thanks, this explains what I observed at a PC retailer. They played a 1080p video through a Blu-ray player connected to a 4K TV, then I had them hook up a gaming PC with a 4K-capable GPU and play the same video. There was a difference. Nothing drastic, but the picture quality from the Blu-ray player looked stretched out and faded compared to what I saw on the PC.

Strange that the store didn't have the 4K display hooked up to the gaming PC in the first place.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Your feedback suggests there was some distortion by the 4K TV when it tried to display the 1080p content.

But was the TV upscaling, or was the BR player?

Also, upscaling 1080p doesn't take much brains; it's a simple conversion where the exact value of one pixel in the 1080p source is displayed on the 4K panel using 4 pixels. So there is no need to do any fancy interpolating or anything; it's straight-up zooming. I'd think your visual impression of the stretching was due to some kind of problem with the TV/BR setup, as there is no need to stretch anything.
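If it helps, the literal 1-pixel-into-4 mapping is nothing more than this (a minimal numpy sketch on a placeholder frame):

```python
# The "one 1080p pixel becomes four 4K pixels" idea: exact 2x pixel doubling,
# i.e. nearest-neighbour at an integer factor, with no interpolation at all.
import numpy as np

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)  # placeholder frame (H x W x RGB)

# Repeat every row and every column twice, so each source pixel fills a 2x2 block.
frame_2160p = np.repeat(np.repeat(frame_1080p, 2, axis=0), 2, axis=1)

print(frame_2160p.shape)  # (2160, 3840, 3)
```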
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Your feedback suggests there was some distortion by the 4K TV when it tried to display the 1080p content.

But was the TV upscaling, or was the BR player?

Also, upscaling 1080p doesn't take much brains; it's a simple conversion where the exact value of one pixel in the 1080p source is displayed on the 4K panel using 4 pixels. So there is no need to do any fancy interpolating or anything; it's straight-up zooming. I'd think your visual impression of the stretching was due to some kind of problem with the TV/BR setup, as there is no need to stretch anything.


Umm no, unless you are keeping display size as a constant, that would be about the worst re-sizing you could do (nearest neighbor), full of aliasing artifacts.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Umm no, unless you are keeping display size as a constant, that would be about the worst re-sizing you could do (nearest neighbor), full of aliasing artifacts.

There is no nearest neighbor or aliasing when you display 1080p on a 4K display.

Do you think the following explanation is true? See:
http://www.ultrahdtv.net/what-is-ultra-hdtv/

Ultra HD Upscaling

The Ultra HD 4K resolution of 3840 × 2160 simplifies video scaling from the popular high-definition source formats 720p and 1080p. A 1080p video source can be scaled perfectly by simply doubling each pixel horizontally and vertically, using 4 pixels on the Ultra HD 4K display to represent each pixel from the 1080p source. Similarly, a 720p source pixel can be tripled horizontally and vertically, using 9 pixels on the 4K display for each pixel from the 720p source. The 720p and 1080p resolutions will also evenly divide the 8K resolution of 7680 × 4320.

See, there is no need to interpolate, or alias, or guess, or anything. You literally just use 4 pixels on the 4K panel to display every 1 pixel of the 1080p source.
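The arithmetic in that quote checks out, for what it's worth:

```python
# The common HD resolutions divide evenly into the UHD resolutions, which is all
# the "scaled perfectly" claim in the quote is really saying.
uhd_4k = (3840, 2160)
uhd_8k = (7680, 4320)

for name, (w, h) in {"1080p": (1920, 1080), "720p": (1280, 720)}.items():
    print(name,
          "-> 4K:", uhd_4k[0] // w, "x", uhd_4k[1] // h,
          "| -> 8K:", uhd_8k[0] // w, "x", uhd_8k[1] // h)
# 1080p -> 4K: 2 x 2 | -> 8K: 4 x 4
# 720p  -> 4K: 3 x 3 | -> 8K: 6 x 6
```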
 

muskyx1

Member
Apr 20, 2005
151
1
81
Your feedback suggests there was some distortion by the 4K TV when it tried to display the 1080p content.

But was the TV upscaling, or was the BR player?

Also, upscaling 1080p doesn't take much brains; it's a simple conversion where the exact value of one pixel in the 1080p source is displayed on the 4K panel using 4 pixels. So there is no need to do any fancy interpolating or anything; it's straight-up zooming. I'd think your visual impression of the stretching was due to some kind of problem with the TV/BR setup, as there is no need to stretch anything.


It was a cheap Samsung with no 4K upscaling capability.
 

spdfreak

Senior member
Mar 6, 2000
956
73
91
OP, what 4K-capable video card are you planning to use? When I searched on NE, they only had two that listed 3840x2160 capability through HDMI. It is a problem... lots of cards will do 4K through DisplayPort but not HDMI. I'm trying to build a system to run dual 4K TVs for a surveillance system that uses 2560x1440 cameras; we would use 2K monitors, but the largest we have found is 27in and they want 40in. I think the HDMI 2.0 spec will solve some of this, but nothing uses it yet...
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
There is no nearest neighbor or aliasing when you display 1080p on a 4K display.

Do you think the following explanation is true? See:
http://www.ultrahdtv.net/what-is-ultra-hdtv/



See, there is no need to interpolate, or alias, or guess, or anything. You literally just use 4 pixels on the 4K panel to display every 1 pixel of the 1080p source.


You've got to be kidding me. One more time: unless you keep all dimensions constant, you cannot just turn 1 pixel into 4 and get away with it. The whole point of 4K is to either 1) make the display bigger or 2) sit closer to your display; otherwise there is no point in increasing the resolution. And in either scenario, if you don't properly interpolate 1080p, it will look like ass. It doesn't matter that it fits mathematically.

1080p on 50"
[attached image: Image-before-scaling.png]

1080p on 100" (or viewing at half distance) using your method
[attached image: Image-after-trivial-scaling.png]

1080p interpolated using bicubic scaling
[attached image: Image-after-cubic-interpolation.png]
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
You've got to be kidding me. One more time: unless you keep all dimensions constant, you cannot just turn 1 pixel into 4 and get away with it. The whole point of 4K is to either 1) make the display bigger or 2) sit closer to your display; otherwise there is no point in increasing the resolution. And in either scenario, if you don't properly interpolate 1080p, it will look like ass. It doesn't matter that it fits mathematically.

1080p on 50"
[attached image: Image-before-scaling.png]

1080p on 100" (or viewing at half distance) using your method
[attached image: Image-after-trivial-scaling.png]

1080p interpolated using bicubic scaling
[attached image: Image-after-cubic-interpolation.png]

You are right; it would make sense to interpolate to try to make the upscaling look better.

But I am torn between being a purist and only looking at the actual source information, vs. allowing the upscaler to guess at filling in the gaps with new information that was never in the original.

The text example above looks pretty good with interpolation. But surely there are examples where the interpolation introduces artificial information that looks a little odd, or where it simply guesses wrong?

The aliasing on the text looks wrong and the interpolation looks better because we know the text should be a smooth slanted line, so we are OK with adding information to approximate the true slant of the lines. But what if instead we are displaying a porcupine, or something that was intended to be blocky/jaggy rather than a pure slanted line? The interpolation would blur it all together, which is something you don't want.

But it is interesting to think about which viewing distance and screen size combinations let the human eye perceive the differences. I think there are many situations where you could, in fact, get away with the simple 1-to-4 zooming. And in situations where it does become noticeable, I still tell myself that it's the purest form of the original 1080p source, without any blurring/guessing thrown on top to try to make it look better. Maybe a good analogy is how some people prefer to disable the faux 120 fps interpolation when viewing DVDs, preferring the original source at 29 fps or whatever. You could argue that the 120 frames are 'better', but the purist might disagree because it's guessing at filling the gaps, where the original source never contained that data in the first place.
 
Last edited:

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
You are right; it would make sense to interpolate to try to make the upscaling look better.

But I am torn between being a purist and only looking at the actual source information, vs. allowing the upscaler to guess at filling in the gaps with new information that was never in the original.

The text example above looks pretty good with interpolation. But surely there are examples where the interpolation introduces artificial information that looks a little odd, or where it simply guesses wrong?

The aliasing on the text looks wrong and the interpolation looks better because we know the text should be a smooth slanted line, so we are OK with adding information to approximate the true slant of the lines. But what if instead we are displaying a porcupine, or something that was intended to be blocky/jaggy rather than a pure slanted line? The interpolation would blur it all together, which is something you don't want.

But it is interesting to think about which viewing distance and screen size combinations let the human eye perceive the differences. I think there are many situations where you could, in fact, get away with the simple 1-to-4 zooming. And in situations where it does become noticeable, I still tell myself that it's the purest form of the original 1080p source, without any blurring/guessing thrown on top to try to make it look better. Maybe a good analogy is how some people prefer to disable the faux 120 fps interpolation when viewing DVDs, preferring the original source at 29 fps or whatever. You could argue that the 120 frames are 'better', but the purist might disagree because it's guessing at filling the gaps, where the original source never contained that data in the first place.

Let's not even dive into interpolating temporal resolution, which always results in garbage because it does not actually increase temporal resolution; it just creates the perception of it.

With regard to algorithms making mistakes... it depends how far you want to go. A simple bilinear 2x2 resize is exactly what you need to display 1080p on a 4K panel smoothly without introducing anything negative. But once you get into scaling that isn't mathematically exact, bilinear begins to look too "soft", or blurry. So we have bicubic 4x4, which produces a sharper image with less aliasing without introducing too many ringing artifacts. Lanczos and Spline36 are even better at suppressing aliasing and preserving sharpness, but they can introduce too much ringing. So it's all a trade-off.
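To make the sharpness-vs-ringing trade-off a bit more concrete, here's a sketch of the Lanczos kernel itself (just the math, not any particular renderer's implementation):

```python
# Sketch of the Lanczos kernel (a windowed sinc). The negative lobes are what give
# Lanczos its sharpness, and also what produce ringing around hard edges; bilinear's
# triangle kernel has no negative lobes, so it never rings but looks softer.
import math

def lanczos_kernel(x: float, a: int = 3) -> float:
    """Lanczos-a weight for a sample at distance x from the output pixel centre."""
    if x == 0.0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

# Weights at a few sample distances; note the negative values at |x| = 1.5.
print([round(lanczos_kernel(x), 3) for x in (-2.5, -1.5, -0.5, 0.5, 1.5, 2.5)])
# -> approximately [0.024, -0.135, 0.608, 0.608, -0.135, 0.024]
```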

But yes, if you want to be a purist at the same display size and viewing distance, then simply displaying 1080p on a 4K panel with nearest neighbour would produce the same perceived image without modifying the source and without the need for any algorithms. But let's be real here: if people had kept buying the same size TVs as technology progressed and constantly moved into larger houses, we'd all be watching 27"-32" TVs in 20'x20' rooms. Instead, the trend is people moving into tiny apartments and hanging 60+ inch TVs on the wall.