Tom's review of the Asus PB328Q 32-inch AMVA QHD

biostud

Lifer
Feb 27, 2003
19,485
6,549
136
http://www.tomshardware.com/reviews/asus-pb328q-32-inch-amva-qhd-monitor,4427.html

Besides getting a really good review, it is a 10-bit panel. Is that the same as 'HDR' that AMD talks about in the DP1.3 slides?

AMD-Radeon-RTG-Technology-Summit-DisplayPort-1.3-HDMI-2.0-AMD-FreeSync-Lenovo-Y700-High-Dynamic-Range-HDR-8-740x416.jpg


(Obviously this isn't a DP1.3 display, but an AMVA panel like this combined with FreeSync/G-Sync, DP1.3 and a higher refresh rate would be the ideal non-UltraWide/4K gaming monitor)
 
Last edited:

4K_shmoorK

Senior member
Jul 1, 2015
464
43
91
Nope, no HDR. But these test results are fantastic!

Wish I could trade my S32D850T in for this. Colors on par with most IPS panels, lower response times, and 75Hz.

And it's only $469 on Newegg.:eek:
 

biostud

Lifer
Feb 27, 2003
19,485
6,549
136
Nope, no HDR. But these test results are fantastic!

Wish I could trade my S32D850T in for this. Colors on par with most IPS panels, lower response times, and 75Hz.

And it's only $469 on Newegg.:eek:

Then what is the difference between HDR and a 10-bit panel?

amd-hdr-in-2016.jpg


Enter 10-bit, and HDR

The big push for 10-bit panels is due to the advent of High Dynamic Range content. This 10-bit content can have more detail in the bright areas of the image, and/or more details in the dark parts of the image. It can also have a greater range between the brightest and darkest parts of the image.

To use our example from before, in theory with HDR content, you could have the dark room while still seeing what was outside at the same time. Done right, the image looks far more realistic.

It also means more shades of color: 1,024×1,024×1,024 = 1,073,741,824. Yep, over one billion colors. Potentially.

http://hdguru.com/whats-a-10-bit-tv-and-is-it-better/
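The "one billion colors" figure above falls straight out of the per-channel bit depth. A quick sketch in Python (not from the article, just the arithmetic made explicit):

```python
# Number of distinct colors for a given per-channel bit depth.
def color_count(bits_per_channel: int) -> int:
    shades = 2 ** bits_per_channel   # shades per R/G/B channel
    return shades ** 3               # every R x G x B combination

print(color_count(8))   # 8-bit:  16,777,216 (~16.7 million)
print(color_count(10))  # 10-bit: 1,073,741,824 (~1.07 billion)
```

So 10-bit gives 64x the colors of 8-bit, from just 4x the shades per channel.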
 

4K_shmoorK

Senior member
Jul 1, 2015
464
43
91
Not sure there are definite 'specifications' that outline what makes a panel HDR.

0.1 cd/m^2 max black luminance and ~345 cd/m^2 max white luminance is nothing novel or new.

Most articles I've read state that new 'HDR' panels should be somewhere between 500-2000 nits.
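For what it's worth, those two luminance numbers pin down the panel's static contrast ratio. A trivial back-of-the-envelope check (the 345/0.1 figures are the ones quoted above, not a spec):

```python
# Static contrast ratio from measured white and black luminance (in cd/m^2).
def contrast_ratio(white_nits: float, black_nits: float) -> float:
    return white_nits / black_nits

# ~345 cd/m^2 white over 0.1 cd/m^2 black, per the numbers above.
print(f"{contrast_ratio(345, 0.1):.0f}:1")  # 3450:1
```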

Here's an article:
http://www.flatpanelshd.com/focus.php?subaction=showfull&id=1435052975

Some highlights (BT.709 vs. DCI P3 vs. BT.2020):
There is more to “HDR”
As discussed in the previous sections, HDR requires higher brightness levels, deeper bit rates for colors, and a new PQ format. An interesting thing to observe is that the industry intends to do more. With higher brightness come better colors. Or to be precise; the possibility of a wider color gamut. A wider color gamut is not an element of “HDR” per se but when most people in the industry say “HDR” they typically mean better colors, too.


Today’s TVs use the so-called BT.709 color gamut, which can reproduce only around 35% of the colors that the human eye can perceive. The first HDR-enabled TVs are capable of reproducing most of the DCI P3 color space that cinemas use. DCI P3 covers approximately 54% of the colors we can see. But the industry has proposed a new far more ambitious BT.2020 color gamut that covers almost 76%!

The same can be said for 4K Ultra HD resolution. HDR works with HD resolution, sure, but no one seems interested in making it happen, so usually when you hear “HDR” it will imply HDR in 4K resolution.

As said, these things are not actually part of “HDR” but the industry seems to be taking the step to first DCI P3 and later BT.2020 with the introduction of HDR. And that is amazing! Full BT.2020 coverage will likely take some years to fulfill and as the name suggests it is actually a recommendation for year 2020. But the industry is moving forward and it looks like we could see the first high-end TVs with full support quite soon.
colorgamuts2.jpg


A completely new foundation - a technical look
The subject of gamma and light is beyond the scope of this article but to fully comprehend HDR it is important to understand that most of the picture standards for today’s TVs were developed based on CRT (cathode ray tube) displays. We have yet to define fundamental standards for digital displays.

Today’s TVs use an EOTF (Electro-optical Transfer Function) method to convert an input signal into visible light (and subsequently an image), and this system still relies on the characteristics of analog CRT displays, the so-called gamma curve. This is why displays use a gamma function (typically 2.2 or 2.4). We often refer to this gamma curve in our reviews.
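The gamma EOTF the article describes is simple enough to write down. A minimal sketch in Python (my illustration, not the article's; assumes a normalized 0..1 signal and a hypothetical 100-nit SDR display):

```python
# Gamma EOTF: convert a normalized video signal (0..1) into display light.
def gamma_eotf(signal: float, gamma: float = 2.2, peak_nits: float = 100.0) -> float:
    """Display luminance in nits for a normalized input signal."""
    return peak_nits * (signal ** gamma)

# Half signal is nowhere near half brightness -- that's the gamma curve,
# a legacy of how CRT phosphors responded to voltage.
print(round(gamma_eotf(0.5), 1))  # ~21.8 nits on a 100-nit display
print(gamma_eotf(1.0))            # 100.0 nits at full signal
```

Note the brightness scale here is relative: the same signal gets brighter on a brighter display, which is exactly what PQ was designed to fix.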

However, today’s LCD and OLED display technologies are capable of more than CRT, and with HDR it looks like we will finally develop display standards that are based on the characteristics of the human eye instead of the limitations of an old analog display technology.

Before we get to that consider the following. Movies and TV shows are created and graded based on these principles that assume a maximum brightness level (white) of around 80-120 nits (or cd/m2) and a minimum (black depth) of around 0.05 cd/m2 for living room TVs (around maximum 48 nits for cinema). Absolute black is zero and the best consumer displays such as OLED can reach that. Modern TVs can also go way beyond 80-120 nits for maximum brightness, which means that most TV manufacturers have tried to “enhance” the picture in numerous ways, not unlike how TV manufacturers try to “enhance” motion. The content creators hate it but the point is that our displays are capable of more than the standards allow.

Unfortunately, most people associate “high brightness” on displays with bad things due to how TV manufacturers have approached it in the past. We often hear questions like “is HDR kind of like the dynamic mode on my TV”? Forget those thoughts for now and consider that a typical day with thin clouds equals something like 4000-7000 nits and a sunny day has an ambient light level of over 30,000 nits. Direct sunlight is even more extreme. We obviously don’t want to have to wear sunglasses in front of our TV but if we want to recreate the real world on a display there is no other way; we need higher brightness. Also, remember that the human eye dynamically adapts to light in our environment by closing and opening the pupil. That is how the human vision dynamically adjusts to daytime and night time.

So how much brightness do we need? That is a subject for debate. Dolby believes that we need a dynamic range of 0 to 10,000 nits, even though its Dolby Vision format usually has a lower maximum. The Blu-ray association recommends that “over 1000 nits should be limited to specular highlights”. Below you see the results of Dolby’s research.
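That 0-to-10,000-nit range is exactly what the PQ curve (SMPTE ST 2084) encodes. A sketch of the PQ EOTF in Python, using the constants from the standard (my illustration, not part of the article):

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized 0..1 signal to absolute
# luminance in nits, over a fixed 0-to-10,000-nit range.
M1 = 2610 / 16384          # 0.1593...
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Absolute luminance in nits for a normalized PQ-coded signal."""
    e = signal ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

print(pq_eotf(0.0))  # 0 nits
print(pq_eotf(1.0))  # 10000.0 nits: the full Dolby range
# The mid-code signal maps well below half brightness: PQ spends most
# of its code values on the dark end, matching how the eye works.
print(round(pq_eotf(0.5)))
```

Unlike gamma, PQ is absolute: a given code value means the same number of nits on every display, regardless of peak brightness.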
dolbyluminance.jpg
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
HDR requires 10-bit but also some other things. The 10 bits are there to keep similar precision to current 8-bit while covering a much bigger range of brightness.
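A simplified way to see that precision argument (a linear-steps illustration of mine; real HDR uses the nonlinear PQ curve, so actual step sizes vary across the range):

```python
# Average brightness step if a range were spread linearly over the
# available code values. Simplified: real systems use PQ, not linear steps.
def avg_step_nits(peak_nits: float, bits: int) -> float:
    return peak_nits / (2 ** bits - 1)

print(round(avg_step_nits(100, 8), 3))     # SDR range on 8 bits: 0.392 nits/step
print(round(avg_step_nits(10000, 8), 3))   # HDR range on 8 bits: 39.216 nits/step
print(round(avg_step_nits(10000, 10), 3))  # HDR range on 10 bits: 9.775 nits/step
```

Stretch an 8-bit signal over 10,000 nits and the steps get coarse enough to band visibly; 10 bits claws most of that precision back.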
 

4K_shmoorK

Senior member
Jul 1, 2015
464
43
91
If this panel only had Freesync it would be the perfect budget 32" panel.

Agreed.:thumbsup:

Have a feeling it won't be long now until most, if not all, mid-to-high range panels will come with some sort of adaptive sync.

Maybe even sneak in to some TVs here in the states.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
The difference between HDR and simply 10-bit is that HDR includes a luminance range for those 10 bits. An HDR display requires a good amount of contrast to pull off. This panel could probably do it half-way decently in theory, but it is not set up to receive an HDR signal. You're not going to see an IPS monitor doing it properly anytime soon, if ever.
 
Last edited:

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
Agreed.:thumbsup:

Have a feeling it won't be long now until most, if not all, mid-to-high range panels will come with some sort of adaptive sync.

Maybe even sneak in to some TVs here in the states.


Also, given AMD is adding FreeSync over HDMI, I don't see why TV manufacturers wouldn't embrace it. Perhaps when this happens Nvidia will cave and start supporting it.

There's already a decent list of panels here to get things started:

http://www.amd.com/Documents/freesync-hdmi.pdf

Too bad the consoles use older GCN 1.0 so you won't see them supporting it :(
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
The difference between HDR and simply 10-bit is that HDR includes a luminance range for those 10 bits. An HDR display requires a good amount of contrast to pull off. This panel could probably do it half-way decently in theory, but it is not set up to receive an HDR signal. You're not going to see an IPS monitor doing it properly anytime soon, if ever.


LG is releasing new 4K IPS displays this year that support HDR. They announced them at CES 2016 last month so we know fundamentally IPS isn't a limiting factor for HDR.

Hopefully this will translate into more than just AMVA/VA computer monitors having HDR, as they still have their limitations (worse rise/fall times, colour accuracy and viewing angles) compared to IPS.
 
Last edited:

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
These 32 (31.5?) inch QHD monitors seem good for those who like the PPI of a 23-24 inch 1080p display. I'm personally looking forward to more 4K displays at this size.
 
Last edited:

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
LG is releasing new 4K IPS displays this year that support HDR. They announced them at CES 2016 last month so we know fundamentally IPS isn't a limiting factor for HDR.

Hopefully this will translate into more than just AMVA/VA computer monitors having HDR, as they still have their limitations (worse rise/fall times, colour accuracy and viewing angles) compared to IPS.

Until we see some sort of local dimming on a PC monitor, then yes, IPS is a fundamental limiting factor. A VA panel has the baseline contrast to do it on its own (barely), but an IPS does not. LG is only getting its IPS TVs HDR-certified by using local dimming.

I'm also convinced the UHDP certification is a crock. They state 0.05 nits and 1000 nits as the luminance range to qualify, but if they are passing edge-lit sets, then they are not using static numbers, and we will see a huge variation in the performance of sets that all claim the same luminance range.
 
Last edited: