At CES this year there was a ton of talk about HDR-capable televisions. I was a bit surprised at this interest (and at the apparent suggestion that HDR is something "new" in the display industry), because HDR has been implemented in the video game industry for a very long time. Indeed, I believe it was Valve that introduced the gaming world to the concept back in 2005, when it released the "Lost Coast" technical demo for its blockbuster Half-Life 2. Since that time, HDR has been implemented in innumerable games, provided the consumer's graphics card was capable of rendering HDR content. Notably, no new monitors were required to display that content. Existing monitors could do so, provided the renderer (i.e., the graphics card) performed the necessary processing.
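To make that last point concrete, here is a minimal sketch (in Python with NumPy, not Valve's actual shader code) of the kind of tone mapping a game engine performs: the scene is rendered internally with floating-point brightness values far beyond what the monitor can reproduce, and a tone-mapping operator (here the classic Reinhard curve) compresses that range down to the 8-bit signal an ordinary SDR monitor expects. In other words, the "HDR" lived in the rendering pipeline, not on the screen.

```python
import numpy as np

def reinhard_tonemap(hdr_frame, exposure=1.0, gamma=2.2):
    """Map floating-point HDR radiance values into the 0-255 range
    of a conventional SDR monitor (global Reinhard operator)."""
    scaled = hdr_frame * exposure
    ldr = scaled / (1.0 + scaled)        # compress unbounded values into [0, 1)
    ldr = np.power(ldr, 1.0 / gamma)     # gamma-encode for the display
    return (np.clip(ldr, 0.0, 1.0) * 255).astype(np.uint8)

# Example: a synthetic 1080p frame with radiance values well above 1.0
# (e.g., a bright sky), as a renderer might produce internally.
hdr_frame = np.random.uniform(0.0, 16.0, size=(1080, 1920, 3)).astype(np.float32)
sdr_frame = reinhard_tonemap(hdr_frame, exposure=0.5)
print(sdr_frame.dtype, sdr_frame.min(), sdr_frame.max())  # uint8, within 0-255
```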
With the foregoing in mind, I have heard that in the 4K TV context, new panels (presumably implementing different display technology) will be needed to display HDR content. Juxtaposed against what happened in the PC context, this leads me to ask: what is different about HDR in the 4K TV context relative to the PC context? Why is it that age-old PC monitors can display HDR content rendered by a graphics card, while the latest and greatest TV panels cannot, even though they are able to push an insane number of pixels?
My question particularly relates to the evolutionary TV line produced by Samsung, e.g., the 8550, 8700, and 9000 series 4K TVs. My impression (perhaps incorrect) is that when the panels of those TVs are connected to Samsung's One Connect box, all of the image processing is done by the guts of the One Connect box, not by the circuitry within the panel. In essence, the One Connect box acts as a video card driving the panel. If that is the case, why couldn't HDR be implemented by Samsung in a new One Connect box? Why would a new panel be required?
