Someone explain to me why HDR matters?


RaynorWolfcastle

Diamond Member
Feb 8, 2001
8,968
16
81
Except it really doesn't matter what kind of display you're using, because while video cards can internally handle HDR, AFAIK none of them actually output HDR to the display's framebuffer.
The display doesn't have a framebuffer, only the GPU does.
I meant the output buffer right before either the DAC or the DVI connection.

From what I understand, while video cards can internally render using floating-point formats, the framebuffer right before the DAC is a 24-bit integer buffer, so it outputs the same 16.7M colors over VGA regardless of your monitor.
Sure, but the data was composed with FP/shader blending, and hence it has a much higher dynamic range, with fewer truncation/rounding issues on its way to the framebuffer. This still makes a difference regardless of the integer storage at the end.
You still have the exact same dynamic range as before, you do realize that right? Your CRT still gets voltage signals in the same range over VGA and your LCD still gets the same numbers over DVI (excluding the Brightside LCD which no one actually has).
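A minimal sketch of the precision argument, in Python with made-up numbers (not any particular game or GPU): blending in floating point keeps dim contributions that an 8-bit integer pipeline rounds away, even though both end up as 8 bits per channel at the display.

# Hypothetical: 256 passes that together add a dim light worth 0.3 of full scale.
STEPS = 256
contribution = 0.3 / STEPS

# 8-bit integer accumulation: every pass is rounded to the nearest code first.
int_buffer = 0
for _ in range(STEPS):
    int_buffer = min(255, int_buffer + round(contribution * 255))  # rounds to 0 each pass

# Floating-point accumulation, quantized once at the very end.
fp_buffer = 0.0
for _ in range(STEPS):
    fp_buffer += contribution
fp_as_8bit = round(min(fp_buffer, 1.0) * 255)

print(int_buffer)   # 0 -- the dim lighting is truncated away entirely
print(fp_as_8bit)   # ~76 -- about 0.3 of full scale survives to the framebuffer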

If you had reworked a monitor's color space to be floating point, then HDR would work.
No reworking is required to make HDR viable. When done right it looks great (Painkiller, Riddick, HL2, etc).
From the screenshots I've seen, it looks nice sometimes and sometimes it looks horrid. Look at the screenshots I posted; they look horrid.

In all three of these, half the scene is clipped because of insufficient dynamic range.
If half of the scene was clipped then it wouldn't be rendered at all, which is clearly not the case.
Clipping in the DSP sense, not in the 3D pipeline sense. Clipping in the sense that there's insufficient dynamic range so you have a crapload of quantization error. You see how half the scene is completely white? That's because HDR is useless in a 24-bit integer color space.

It absolutely does. If your monitor can't handle the variance in luminance the scene has, then you will not get proper lighting displayed in front of you - there is no way around this.
Yes, you will get proper lighting with a very compressed dynamic range. You can split the range 0 to 1 with fp32 and get "correct" lighting in the same sense that you can split the range 0 to 1 billion with fp32 with "correct" lighting. Obviously you have better dynamic range with the larger range, but both are "correct". There's no absolute measure of brightness once the monitors are calibrated (at the factory and by you).
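A quick sketch of the "no absolute brightness" point (Python, hypothetical radiance values): the same scene stored on two different absolute scales produces identical 8-bit output once each is normalized to its own white point, so the display never sees the absolute scale.

scene = [0.02, 0.1, 0.35, 0.7, 1.0]        # relative radiances on a 0..1 scale
scene_big = [v * 1e9 for v in scene]       # the same scene on a 0..1e9 scale

def to_8bit(values, white):
    """Normalize to the chosen white point, clamp, quantize to 0..255."""
    return [round(min(v / white, 1.0) * 255) for v in values]

print(to_8bit(scene, 1.0))       # 8-bit codes from the 0..1 version
print(to_8bit(scene_big, 1e9))   # identical codes from the 0..1e9 version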


BTW mainstream CRTs and LCDs didn't just suddenly acquire new dynamic range nor did it suddenly get "unlocked" by HDR. For VGA output, each sub-pixel receives a voltage that corresponds to the range [0,Vmax], which is exactly the range you had before video cards started displaying HDR. Same with LCDs, they still have the exact same dynamic range as they previously had; monitors don't suddenly acquire new ways to display images.

I'm not saying that my monitor should be so bright that I have to wear sunglasses to watch it, I'm just saying that the HDR screens that I've seen are for the most part ridiculously overexposed to show off a "blooming" effect. If Brightside uses fp16 to modulate brightness levels, then that's great; that's how HDR *should* be done.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
You still have the exact same dynamic range as before, you do realize that right?
For practical purposes you don't, because the accumulated frame buffer holds different (and more accurate) data with HDR rendering than it does with non-HDR. If the display were the absolute limitation, there would be no difference between HDR and non-HDR, a scenario that is clearly false.

Following your logic AA is also useless since the frame buffer has the same precision - and likewise the display has the same limitation too - before and after it is used. Again, this is absolutely nonsensical reasoning.
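A toy version of that AA analogy (Python, made-up coverage values): the framebuffer is 8-bit either way, but the extra internal samples still change what lands in it.

# An edge pixel half covered by white geometry over a black background.
subsamples = [1.0, 1.0, 0.0, 0.0]               # 4x supersampling coverage

no_aa   = round(subsamples[0] * 255)            # one sample: 255 (a jagged edge)
with_aa = round(sum(subsamples) / 4 * 255)      # average first, then quantize: 128

print(no_aa, with_aa)   # same 8-bit output format, visibly different result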

From the screenshots I've seen, it looks nice sometimes and sometimes it looks horrid. Look at the screenshots I posted; they look horrid.
That's your opinion, and it's not relevant to the fact that HDR does make a difference for the better regardless of the limitations of current displays.

Clipping in the sense that there's insufficient dynamic range so you have a crapload of quantization error.
You have far more errors with non-HDR rendering, yet I don't ever recall you complaining about them. Current HDR may not be perfect, but it matches reality much more closely than non-HDR rendering.

You see how half the scene is completely white? That's because HDR is useless in a 24-bit integer color space.
Or it could be that the artists/programmers designed it to look like that in order to generate a large contrast between the light and dark areas. HDR doesn't have to be the same across implementations any more than textures or geometry have to be the same.

BTW mainstream CRTs and LCDs didn't just suddenly acquire new dynamic range nor did it suddenly get "unlocked" by HDR.
I'm not really sure what the point of this comment is as there is a clear difference between HDR and non-HDR images, regardless of whether the display acquired anything new or not. The difference is coming from HDR rendering, not from the display. It seems to me that you're arguing an irrelevant tangent.

If Brightside uses fp16 to modulate brightness levels, then that's great; that's how HDR *should* be done.
So basically it's just your opinion that HDR doesn't look nice and you're trying to project that opinion as fact using display limitations. Like I said before that is an invalid course of reasoning. Whether you like the images or not doesn't change the reality that they are more accurate than non-HDR images.

I'm not sure why this myth of displays being the hard limit keeps getting spread, but it needs to stop. Here's a good quote I found from someone a little higher up than myself about the effects of better internal precision:

"The advent of 32-bit floating-point pixel precision makes it possible to create high-quality images. Efficient volumetric effects - ground fog, spherical fog, sprites that fade out smoothly rather than getting clipped by world geometry - can be based on buffering and reusing per-pixel 32-bit floating-point z. Precise per-pixel lighting attenuation formulas can be based on passing in light-source positions in 32-bit floating-point vector registers. And much higher-quality bump-mapping is possible with 16- and 32-bit floating point. With just 8 bits per component, there were noticeable artifacts and un-normalized bump maps."
Tim Sweeney, Epic Games
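A minimal illustration of the "un-normalized bump maps" point from the quote (Python; the normal vector is just an example): storing a unit normal at 8 bits per component lets its length drift, while a 16-bit float round trip barely moves it.

import math
import struct

n = (0.36, 0.48, 0.80)   # unit-length normal: 0.36^2 + 0.48^2 + 0.80^2 = 1

# 8 bits per component: map [-1, 1] onto integer codes 0..255 and back.
n_8bit = tuple(round((c + 1) / 2 * 255) / 255 * 2 - 1 for c in n)

# 16 bits per component: round-trip each value through IEEE half precision.
n_fp16 = tuple(struct.unpack('e', struct.pack('e', c))[0] for c in n)

length = lambda v: math.sqrt(sum(c * c for c in v))
print(length(n_8bit))   # drifts off unit length after the 8-bit round trip
print(length(n_fp16))   # stays within about 0.001 of 1.0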
 

RaynorWolfcastle

Diamond Member
Feb 8, 2001
8,968
16
81
This argument is going in circles, so I'm going to leave it at that after this post, but:
A) Tim Sweeney's comment refers to internal quantization errors, not dynamic range.
B) Look out the window, even on a bright day it doesn't look anything like what HDR looks like in those scenes.
C) I'm not arguing that the idea of HDR is bad, it certainly isn't, I'm saying that HDR in its current implementation is just trading one kind of error for another. Now if you think it's fantastic then good for you. Personally, I just see it as being incorrect in a different kind of way.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
A) Tim Sweeney's comment refers to internal quantization errors, not dynamic range.
His general points still apply to HDR: better internal precision translates to better overall precision even if some data is lost at the end stage(s).

I also forgot to mention that NV4x, G70 and R520 (presumably) can output to a floating point frame buffer if requested.

I'm not arguing that the idea of HDR is bad, it certainly isn't, I'm saying that HDR in its current implementation is just trading one kind of error for another.
While this may be true to some degree, the end result is that HDR is more accurate than non-HDR rendering.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
What is the contrast ratio of a decent AG CRT (but not some $50,000 specialized beast, just a consumer AG CRT)?

A good one is around the 10,000:1 mark- a POS one is around 2,000:1.

CRT v LCD in contrast ratio.

Some LCDs with LED backlighting best CRTs AFAIK, or at least the BrightSide does (that's what the article sounded like). I know it's $50,000 but it's still an LCD, and hopefully it will come down in price.

Using standard VESA testing Brightside manages 25,000:1, which is certainly exceptional (nowhere near what their PR people would have you believe), but this is due to the LED array, not the LCD itself.

I really doubt you could tell the difference between 0 and 0.2.

That difference is fairly large actually, and the overwhelming majority of LCDs with that kind of rating can only hit it at unusable settings.
 

aidanjm

Lifer
Aug 9, 2004
12,411
2
0
Originally posted by: Jeff7181
Originally posted by: RaynorWolfcastle
So nearly all LCDs on the market are 8-bit panels with a constant-brightness backlight, which means they can only display 256 shades of gray. AFAIK, once the rendering is done the colours are converted by the video card to 8-bit RGB for display (whether LCD or CRT).

So basically, monitors make no provision for HDR... So right now the choices are:

A) Enable HDR, live with a crapload of blooming and inaccurate brightness because your monitor doesn't support HDR.

B) Low dynamic range, but correct color ranging.

So what exactly is the point of using HDR again? Marginally improved quality if you ignore the obvious errors in brightness?

I'm not sure you can expect a monitor sitting on your desktop to be able to provide zero light at all, or light equal to the brightness of the sun.

not only do I expect it, I demand it :)

 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: BenSkywalker
A good one is around the 10,000:1 mark- a POS one is around 2,000:1.

CRT v LCD in contrast ratio.

OK, yes, but the LCD can still show a much brighter light than the CRT, correct (thus it can display a brighter in-game "sun")? 'Peak luminance' in the CRT specs I listed is the same as 'brightness' in the LCD test? The HDR I'm seeing in Lost Coast makes everything extremely bright so it seems more important to me to have a higher white level. I can't see why you think a CRT is much better for HDR. You're gaining 150 candelas on the white level (for the 770P vs. CRT@6500K) and just losing 0.2 (or maybe 1.0) on the black. We can agree to disagree I guess. The contrast ratio might be higher because of the lower black level but that doesn't tell the whole story. I wish they wouldn't always reduce the fraction as much as possible because for example if it was 500:2 I could actually tell the 500 was white level and the 2 was black level. But 250:1 doesn't tell me anything. It could have a white level of 250000 and a black level of 1000 for all I can tell (and I would go blind within seconds :p). Well, not that I could believe their stupid overrated specs in the first place anyway...I could just get the black level by dividing "brightness" by the contrast ratio.

Using standard VESA testing Brightside manages 25,000:1, which is certainly exceptional (nowhere near what their PR people would have you believe), but this is due to the LED array, not the LCD itself.

An "LCD" doesn't mean a device that uses the cold cathode bulbs. It just means it uses liquid crystals to display something so I don't really know what you're saying here that it's not due to the LCD (I was aware of the VESA->25000:1 part). I still thought they said it could display a perfect black due to the absense of an LED being lit, or that is BS too? Did they have an LED per each pixel? I'm pretty sure they just had one LED for a group of pixels?

That difference is fairly large actually, and the overwhelming majority of LCDs with that kind of rating can only hit it at unusable settings.

It's supposed to be at best-possible calibrated settings. I hope those are usable. :Q

I don't know what to believe when you say you can tell a difference between 0.0 and 0.2 cd/m² because you also say 1600x1200 is a small resolution. :p Could a soccer mom tell the difference (is there easily visible gray or not really)? Is it the difference between not being able to see a tunnel and being able to see it? For the record, what's the black level of a decent CRT? Doesn't some light leak through the shadow mask, or no? Also, what is the contrast ratio of a good shadow-mask CRT, if you know off the top of your head?

If the Iiyama has a max luminance of 80 at 6500K like that expensive NEC, then I can extrapolate a black level of about 0.0085 with a contrast ratio of 9415:1. I wish I knew how bright that was...
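For anyone following the numbers, the only arithmetic involved is contrast ratio = white level / black level; a quick Python sketch reusing figures quoted in this thread:

def black_level(white_cd_m2, contrast_ratio):
    """Back out the black level a brightness/contrast spec implies."""
    return white_cd_m2 / contrast_ratio

print(black_level(80, 9415))   # ~0.0085 cd/m^2 -- the Iiyama extrapolation above
print(black_level(270, 500))   # 0.54 cd/m^2 -- the LCD1980SXi figures quoted later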

BFG10K: He means the monitor can still only display 256*256*256 colors - that the monitor can't suddenly display any more than that. The data in the frame buffer is still converted to the normal 24-bit (0-255,0-255,0-255) RGB we've been using unless you're talking about the dual-link DVI that sends special HDR data on the Brightside LCD.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: xtknight
Originally posted by: BenSkywalker
A good one is around the 10,000:1 mark- a POS one is around 2,000:1.

CRT v LCD in contrast ratio.

OK, yes, but the LCD can still show a much brighter light than the CRT, correct (thus it can display a brighter in-game "sun")? 'Peak luminance' in the CRT specs I listed is the same as 'brightness' in the LCD test? The HDR I'm seeing in Lost Coast makes everything extremely bright so it seems more important to me to have a higher white level. I can't see why you think a CRT is much better for HDR. You're gaining 150 candelas on the white level (for the 770P vs. CRT@6500K) and just losing 0.2 (or maybe 1.0) on the black. We can agree to disagree I guess. The contrast ratio might be higher because of the lower black level but that doesn't tell the whole story. I wish they wouldn't always reduce the fraction as much as possible because for example if it was 500:2 I could actually tell the 500 was white level and the 2 was black level. But 250:1 doesn't tell me anything. It could have a white level of 250000 and a black level of 1000 for all I can tell (and I would go blind within seconds :p). Well, not that I could believe their stupid overrated specs in the first place anyway...I could just get the black level by dividing "brightness" by the contrast ratio.

Using standard VESA testing Brightside manages 25,000:1, which is certainly exceptional (nowhere near what their PR people would have you believe), but this is due to the LED array, not the LCD itself.

An "LCD" doesn't mean a device that uses the cold cathode bulbs. It just means it uses liquid crystals to display something so I don't really know what you're saying here that it's not due to the LCD (I was aware of the VESA->25000:1 part). I still thought they said it could display a perfect black due to the absense of an LED being lit, or that is BS too? Did they have an LED per each pixel? I'm pretty sure they just had one LED for a group of pixels?

That difference is fairly large actually, and the overwhelming majority of LCDs with that kind of rating can only hit it at unusable settings.

It's supposed to be at best-possible calibrated settings. I hope those are usable. :Q

I don't know what to believe when you say you can tell a difference between 0.0 and 0.2 cd/m² because you also say 1600x1200 is a small resolution. :p Could a soccer mom tell the difference (is there easily visible gray or not really)? Is it the difference between not being able to see a tunnel and being able to see it? For the record, what's the black level of a decent CRT? Doesn't some light leak through the shadow mask, or no?

If the Iiyama has a max luminance of 80 at 6500K like that expensive NEC, then I can extrapolate a black level of about 0.0085 with a contrast ratio of 9415:1. I wish I knew how bright that was...

BFG10K: He means the monitor can still only display 256*256*256 colors - that the monitor can't suddenly display any more than that. The data in the frame buffer is still converted to the normal 24-bit (0-255,0-255,0-255) RGB we've been using unless you're talking about the dual-link DVI that sends special HDR data on the Brightside LCD.

We always would rag on my Dad for the same kind of attitude that Ben has... Basically, no one else can see these "imperfections"... "You know what Dad, no one here can see that mark on the floor that we made, even with a microscope"... But Dad sure saw it... If Dad saw it then it was there, or else it was belt time... LOL. My Dad is cool, but one thing I really dislike is when people blow things way out of proportion and then try to convince others of it. Uhmmm, no!

1600x1200 isn't a low resolution - the claim that it is has already been refuted in a detailed post somewhere in the archives. It's an improper use of the term to call it "low" or "small" when several lower options exist.

 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Low is a relative term, so it may be low for him and high for me, and I perfectly respect that. However when you explain something to someone who has a different perception of said relative terms, it's hard to see what they're saying without extrapolating it. My LCD is supposed to have a black level of 0.42 and personally I think the black on it is awful but that may be because of the backlight bleeding on the edges. If it was 0.42 all over, I don't think I could tell the difference on a CRT. I notice in small spots, the black is...real black. Black enough for me that is, and that's what matters (for me). ;)
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: xtknight
Low is a relative term, so it may be low for him and high for me, and I perfectly respect that. However when you explain something to someone who has a different perception of said relative terms, it's hard to see what they're saying without extrapolating it. My LCD is supposed to have a black level of 0.42 and personally I think the black on it is awful.

In this particular context of resolutions displayed on a monitor, low isn't a relative term unless the word is bastardized beyond any normal use. If you play the "relative" game you can get out of anything said or done on a forum, which is complete BS. That is precisely what people do to get out of something they said, and then they blame you for their hyperbolic speech.
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
The difference in contrast is enormous at least to me, even on LCDs with relatively high contrast (and "normal" brightness) ratings. One thing to keep in mind is that it makes a huge difference whether you use the monitor in the daytime or a pitch dark room.

I'm not sure about aperture grille CRTs in general, but I remember finding some NEC brochure-type thing about a year ago that said the Superbright Diamondtron monitors could show a maximum of 280 cd/m^2 brightness, which is comparable to most LCDs. Subjectively speaking, these monitors (or at least the one I have) do look completely different from other CRTs I've seen with either of the superbright modes on.

An "LCD" doesn't mean a device that uses the cold cathode bulbs. It just means it uses liquid crystals to display something so I don't really know what you're saying here that it's not due to the LCD (I was aware of the VESA->25000:1 part). I still thought they said it could display a perfect black due to the absense of an LED being lit, or that is BS too? Did they have an LED per each pixel? I'm pretty sure they just had one LED for a group of pixels?

Those LED LCDs actually don't have very good contrast ratios at all; the NEC one for example is only 430:1, although that's taken at a relatively low brightness of 200 cd/m^2. I think there are 48 LEDs used in the monitor and their main purpose is to provide a more uniform and accurate backlight. It probably also gets rid of the slightly bluish look that CCFL backlights give, although I personally don't mind that.

Anyway, HDR lighting (not the view blooming, which I think looks ridiculous and way overdone) in games looks excellent if done well. I think Splinter Cell Chaos Theory currently has the best-looking implementation of it out of the stuff I've seen. One thing that I recently discovered was that the X1800 cards are advertised as being able to output a 10-bit signal over VGA when running in HDR. That would be pretty neat, but is it a unique feature or can any other cards also do this?
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: aidanjm
Originally posted by: Jeff7181
Originally posted by: RaynorWolfcastle
So nearly all LCDs on the market are 8-bit panels with a constant-brightness backlight, which means they can only display 256 shades of gray. AFAIK, once the rendering is done the colours are converted by the video card to 8-bit RGB for display (whether LCD or CRT).

So basically, monitors make no provision for HDR... So right now the choices are:

A) Enable HDR, live with a crapload of blooming and inaccurate brightness because your monitor doesn't support HDR.

B) Low dynamic range, but correct color ranging.

So what exactly is the point of using HDR again? Marginally improved quality if you ignore the obvious errors in brightness?

I'm not sure you can expect a monitor sitting on your desktop to be able to provide zero light at all, or light equal to the brightness of the sun.

not only do I expect it, I demand it :)

Don't forget your SPF 100.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: xtknight
Low is a relative term, so it may be low for him and high for me, and I perfectly respect that. However when you explain something to someone who has a different perception of said relative terms, it's hard to see what they're saying without extrapolating it. My LCD is supposed to have a black level of 0.42 and personally I think the black on it is awful but that may be because of the backlight bleeding on the edges. If it was 0.42 all over, I don't think I could tell the difference on a CRT. I notice in small spots, the black is...real black. Black enough for me that is, and that's what matters (for me). ;)

In terms of pixel count, 1600x1200 is essentially the 4:3 version of 1920x1080, so if 1600x1200 is low res, then so is the highest HDTV standard. That, and for new games to even play well at 1600x1200, it usually requires at least a $400 video card.
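The pixel counts behind that comparison:

print(1600 * 1200)   # 1,920,000 pixels
print(1920 * 1080)   # 2,073,600 pixels -- only about 8% more than 1600x1200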
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
BFG10K: He means the monitor can still only display 256*256*256 colors - that the monitor can't suddenly display any more than that.
I know exactly what he's saying.

The data in the frame buffer is still converted to the normal 24-bit (0-255,0-255,0-255) RGB we've been using unless you're talking about the dual-link DVI that sends special HDR data on the Brightside LCD.
Going back to my AA example: can we claim 4xAA makes no difference to 1600x1200 because in the end you still have 1600x1200 pixels to work with? Of course not. Likewise HDR isn't useless even if it needs to be scaled back down to 24-bit INT at the end. Sure, the scaling isn't ideal but it's still better than not doing any HDR.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: CP5670
The difference in contrast is enormous at least to me, even on LCDs with relatively high contrast (and "normal" brightness) ratings. One thing to keep in mind is that it makes a huge difference whether you use the monitor in the daytime or a pitch dark room.

It all depends which LCD you're talking about. The 770P is in a league with no other in terms of colors and contrast.

I'm not sure about aperture grille CRTs in general, but I remember finding some NEC brochure-type thing about a year ago that said the Superbright Diamondtron monitors could show a maximum of 280 cd/m^2 brightness, which is comparable to most LCDs. Subjectively speaking, these monitors (or at least the one I have) do look completely different from other CRTs I've seen with either of the superbright modes on.

Hmm. Considering the $5000 Diamondtron only had a brightness of 100 cd/m² I find that hard to believe. Maybe it was a special coating such as X-Brite.

It probably also gets rid of the slightly bluish look that CCFL backlights give, although I personally don't mind that.

THANK GOD. Nothing annoys me more than that (not being sarcastic). I hate that blue! How does black level take that into account? It's a combo of the blue and the backlight bleeding that's annoying. The NEC LCD1980SXi reaches a brightness of 270 cd/m² and a contrast ratio of 500:1, so a black level of 0.54. However, if there's no blue and no bleeding, and if my LCD at 0.42 is any indication (in a small area where it's not blue or bled), that's not bad at ALL.

One thing that I recently discovered was that the X1800 cards are advertised as being able to output a 10-bit signal over VGA when running in HDR. That would be pretty neat, but is it a unique feature or can any other cards also do this?

I'm pretty sure Matrox's can.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Well, I never said it was useless (I love Lost Coast), just that for now it is using the same 24-bit color in the end (for the consumer market). However, now it makes the best possible use of that 24-bit range, that's all, and we have coined it "HDR" and seen widespread usage of it. It may be obvious to you, but perhaps some people who came to this thread did not know that, given the way it was being touted, and that's what the OP wanted to clear up.

Sure, same thing with 4xAA - that's not useless either. We still use the 1600x1200 pixels we always did; nothing magical in the monitor is unlocked, it just makes good use of those 1600x1200 pixels.

I echo what he said:

Originally posted by: Dribble
? to this discussion

As I understand it, HDR means that internally graphics cards use longer numbers (more bits per float) to let them calculate more complex lighting effects without running into rounding errors. To us it means they can now simulate lighting effects such as bloom, which more accurately reflects the way our eyes respond to light.

This has nothing to do with the monitor; the output is still 24-bit colour and that's fine. You don't need a monitor that can be as bright as the sun - it's not the monitor that's doing HDR (e.g. individual pixels so bright they blind you a bit), it's the graphics card; the monitor just displays the result as it always has.

As to clipping, I presume you mean that bits of the scene look too white - they're meant to; if you stare at something really bright, your eyes do that. The pictures are correctly showing what the HDR is trying to do.

I agree with this as well. It's one error for another, but the end result comes out more accurate.

Originally posted by: BFG10K
I'm not arguing that the idea of HDR is bad, it certainly isn't, I'm saying that HDR in its current implementation is just trading one kind of error for another.
While this may be true to some degree, the end result is that HDR is more accurate than non-HDR rendering.

So to answer the OP's question: today's "consumer" HDR matters because it makes the best use it can of the 24-bit range, in stark contrast to "without HDR" rendering. Sorry for that awful pun, it was unintended, I promise.
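To tie that summary to something concrete, here is a minimal sketch of the step everyone has been describing: floating-point radiances get tone-mapped down into the same 8-bit-per-channel range displays have always used. The Reinhard-style curve below is just one common choice shown for illustration; it is not claimed to be what Painkiller, HL2 or Lost Coast actually do.

radiances = [0.05, 0.4, 1.0, 3.0, 12.0]   # hypothetical scene values, 1.0 = "paper white"

def clamp_to_8bit(x):
    """No HDR pass: anything brighter than white simply clips."""
    return round(min(x, 1.0) * 255)

def tonemap_to_8bit(x):
    """Compress the whole range into 0..1 first, then quantize."""
    return round(x / (1.0 + x) * 255)

print([clamp_to_8bit(r) for r in radiances])    # bright values all pile up at 255
print([tonemap_to_8bit(r) for r in radiances])  # bright values stay distinguishable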
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
It all depends which LCD you're talking about. The 770P is in a league with no other in terms of colors and contrast.

Did it just come out? Would be interesting to see it in person. At least the VP930b, 910T and 1905FP, which are all fairly highly rated in contrast, didn't look too great to me in the dark, and the 2405 was quite a bit worse. The LCD on my Dell laptop is, well, horrendous would be putting it lightly, but I suppose that's just an extremely crappy one. :p

Hmm. Considering the $5000 Diamondtron only had a brightness of 100 cd/m² I find that hard to believe. Maybe it was a special coating such as X-Brite.

They all have that same coating as far as I know. I think for that $5000 one you need to use it in sRGB mode to take advantage of its extra color range (the reason why it's so much more expensive than the normal 2070/2141), with which superbright mode cannot be used, so that's probably why it's only rated at 100.

THANK GOD. Nothing annoys me more than that (not being sarcastic). I hate that blue! How does black level take that into account? It's a combo of the blue and the backlight bleeding that's annoying. The NEC LCD1980SXi reaches a brightness of 270 cd/m² and a contrast ratio of 500:1, so a black level of 0.54. However, if there's no blue and no bleeding, and if my LCD at 0.42 is any indication (in a small area where it's not blue or bled), that's not bad at ALL.

I guess that blue tinge is fairly noticeable, just that it doesn't bother me quite as much as the black levels. The LCD1980SXi uses a normal backlight though; NEC's only LED one currently available is the LCD2180WG-LED model.

I'm pretty sure Matrox's can.

I mean any of the gaming cards, specifically the 6800 or 7800 ones that can also do HDR.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: CP5670
Originally posted by: xtknight
It all depends which LCD you're talking about. The 770P is in a league with no other in terms of colors and contrast.

Did it just come out? Would be interesting to see it in person. At least the VP930b, 910T and 1905FP, which are all fairly highly rated in contrast, didn't look too great to me in the dark, and the 2405 was quite a bit worse. The LCD on my Dell laptop is, well, horrendous would be putting it lightly, but I suppose that's just an extremely crappy one. :p

It's coming very soon and already available in foreign countries.

The LCD1980SXi uses a normal backlight though; NEC's only LED one currently available is the LCD2180WG-LED model.

Oh, whoops.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Bump. Can anyone elaborate on my earlier reply to Ben and say whether what I said about CRTs/LCDs is true?