Originally posted by: Modelworks
The review on this very site says that the display does not have a perfect gradient. If it is perfect, how can there be any comparison? It should win outright without any hesitation.
Well, the other two reviews stated it did, so what can we conclude from this? Perhaps AT's display had QC issues, or perhaps not all gradient tests are created equal? That, and my panel appears to have been upgraded since those reviews.
I know what banding is; I've seen it on many other LCDs. I also have excellent vision and can often read text at distances that my co-workers can't. I'm not a blind moron or some kind of rabid LCD fanboy.
To put it simply: in the test you linked, on my display I cannot see any banding or hard gradients in the first two images. I also cannot see anything like what is shown in their flawed example images.
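For anyone who wants to run the same kind of check on their own panel, here's a minimal sketch that generates a grayscale ramp; it assumes the Pillow imaging library, and the image size and ramp direction are arbitrary choices of mine:

# Minimal sketch: generate a grayscale ramp for eyeballing banding.
# Assumes Pillow (pip install Pillow); size and orientation are arbitrary.
from PIL import Image

WIDTH, HEIGHT = 1024, 256
img = Image.new("L", (WIDTH, HEIGHT))
for x in range(WIDTH):
    shade = round(x * 255 / (WIDTH - 1))  # map columns 0..WIDTH-1 onto 0..255
    for y in range(HEIGHT):
        img.putpixel((x, y), shade)
img.save("gradient_test.png")

View the result full screen: visible vertical steps in the ramp are banding, while a clean panel shows a smooth sweep from black to white.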
As for the Dell, I'm not arguing which display is better, I'm merely pointing out that there are some excellent wide-gamut LCDs out there if you want to pay for them, and that they're a definite cut above other LCDs I've used.
Originally posted by: Modelworks
Maybe you got a bad CRT. LCDs are getting better for color, but they're not there just yet.
Backlit LCDs cannot produce true blacks like a CRT; it just isn't possible without an LED backlight and lots of sensors.
Yeah, maybe. Or maybe a 102% NTSC gamut really is better than 85%. If it wasn't, why would NEC go to all that trouble to create a display with the highest possible NTSC color space?
I was actually quite concerned about black levels, because I've seen many LCDs fare poorly there. But as soon as I fired up Doom 3 and saw its rich, inky black shadows, those fears were quickly squashed. Colors are better too, especially the shades of green.
Subjectively, the colors on this display look better to me than they did on my old CRT, and vastly better than on any other LCD I've seen.
Originally posted by: Modelworks
A 12 ms response time from white to black is nowhere near a CRT. LCDs are getting better, but they're not there yet. If you used a CRT at 73 Hz, then it is no wonder you like LCDs. That would cause eye strain. A CRT at 100 Hz is the way to go.
You're confusing multiple terms by arguing them under one umbrella. Response time has absolutely nothing to do with tearing, which in turn has absolutely nothing to do with flicker.
So let's break them down to see where things actually stand:
Response Time: yes, like all LCDs, this one ghosts, but it happens extremely rarely and is quite marginal when it does. Essentially I have to be specifically looking for it to notice it (kind of like the damper wires on my CRT), so I don't consider this a huge advantage for the CRT, because it doesn't impede my ability to game.
Refresh Rate: I'm talking about full frames per second, not flicker. I'm not even arguing flicker; you are. At 73 Hz I found flicker on my CRT, but it was acceptable for gaming, as games tend to use darker surfaces which mask it, and in exchange I got to use 1920x1440. 73 Hz was far too low for the desktop though, so I used 87 Hz there.
To do 100 Hz on my CRT I'd be required to game at 1280x1024 (down from 1920x1440), which looks like utter ass, CRT or no CRT. I'd always try to keep my resolution at 1600x1200 or better because there was clear image degradation below that. 1600x1200 looked visibly coarser than 1920x1440, but it was still acceptable.
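Incidentally, that tradeoff falls straight out of the CRT's fixed horizontal scan budget. Here's a rough back-of-the-envelope sketch in Python; the ~5% vertical blanking overhead and the implied ~110 kHz ceiling are my own assumptions, not measured specs:

# Rough sketch: horizontal scan frequency a given CRT mode demands.
# BLANKING (~5% vertical blanking overhead) is an assumed figure.
BLANKING = 1.05

def hscan_khz(active_lines, refresh_hz):
    # scan freq (kHz) = active lines * blanking overhead * refresh rate
    return active_lines * BLANKING * refresh_hz / 1000

print(hscan_khz(1440, 73))   # 1920x1440 @ 73 Hz  -> ~110 kHz
print(hscan_khz(1024, 100))  # 1280x1024 @ 100 Hz -> ~108 kHz

Both modes land near the same scan-frequency ceiling, which is exactly why pushing the refresh rate up forces the resolution down.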
Getting back to full frames per second, at 73 Hz the CRT could do 73 full frames per second while my LCD does 60. So obviously the LCD visibly tears more because I run without vsync, but tearing never bothered me before.
Originally posted by: Modelworks
It doesn't matter how many pixels it has; it is the size of the pixels that matters.
It's absolutely wrong to claim that pixel size is the sole metric for interpolation and that pixel count means nothing. Your claim is trivial to disprove with basic mathematics.
Say I have two displays with the same dot pitch, but one has a resolution of 6x6 pixels while the other has 4x4. Now I try to interpolate a 3x3 image onto both of them (full screen). On the 6x6 display the result is a perfect interpolation, because each source pixel maps cleanly onto a 2x2 block of four display pixels. On the 4x4 display, some source pixels get mapped to four display pixels but others get mapped to fewer, leading to an uneven image.
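Here's a quick numeric sketch of that mapping, in plain Python, using floor-based nearest-neighbour indexing (my choice of resampling; the argument holds for fancier kernels too):

# Sketch: how many destination pixels (per axis) each source pixel receives
# when scaling a 3-pixel-wide source onto 6- and 4-pixel-wide grids.
def coverage(src, dst):
    counts = [0] * src
    for d in range(dst):
        counts[d * src // dst] += 1  # nearest-neighbour source index
    return counts

print(coverage(3, 6))  # [2, 2, 2] -> every source pixel gets an even 2x2 block
print(coverage(3, 4))  # [2, 1, 1] -> uneven: one pixel drawn twice as wide

The 6x6 grid covers every source pixel evenly; the 4x4 grid cannot, no matter what its dot pitch is.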
Interpolation forms the basis of rasterization, so you're essentially claiming that it doesn't matter what resolution a game runs at, as long as the pixel pitch is the same? What utter nonsense.
Interpolation is based around sampling points and the more you have, the better it works. Obviously a tighter pixel pitch helps too, which is why I mentioned both as an advantage with this display.
But don't take my word for it, take a look at the interpolation tests here:
http://www.prad.de/en/monitore...-hp-lp3065-part10.html
Originally posted by: Modelworks
640x480 on an interpolated LCD looks better than a native CRT? Only if you like to look at something that looks like someone smeared grease all over the screen.
I'm arguing that both displays start having image-quality issues when the resolution drops. The LCD gets softer while the CRT gets coarser. It must, because its pixels effectively get bigger, and I'm not sure why people try to pretend otherwise.
I'm not going to argue the technical merits of the two, but subjectively I think I'd probably rather play Starcraft at 640x480 on my 30" LCD than on my CRT.
Like I said before, I've seen a lot of LCDs, but this HP is really a cut above them all.