Originally posted by: xtknight
So your $169 monitor does over 2560x1600@60 Hz and supports a 330MHz pixel clock input? What model?
Actually, it does it at 65 Hz, which puts the pixel clock around 380 MHz. And yes, it is perfectly clear at that resolution, but a 65 Hz refresh is hard on the eyes. Usually I run either 1920x1440 or 1856x1392 for a better refresh rate. It's an LG Flatron 915FT+.
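In case anyone wants to sanity-check that figure, here's a rough sketch of the math. The blanking overhead factor is my assumption (roughly what GTF-style timings add); the real modeline would shift the numbers a little:

[code]
# Rough pixel-clock estimate for a CRT mode.
# The 1.40 overhead factor (horizontal + vertical blanking) is an assumed
# ballpark for GTF-style timings, not taken from the actual modeline.
def pixel_clock_mhz(h_active, v_active, refresh_hz, overhead=1.40):
    return h_active * v_active * refresh_hz * overhead / 1e6

print(round(pixel_clock_mhz(2560, 1600, 65)))  # ~373 MHz, close to the ~380 MHz above
print(round(pixel_clock_mhz(2560, 1600, 60)))  # ~344 MHz, already past a 330 MHz RAMDAC
[/code]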
Also, my All-in-Wonder 9700 card won't let me output to TV as a 2nd monitor if the main display is cranked that high.
[quote]That's like saying it doesn't make any sense to watch streaming media because it's digital and your CRT is analog.[/quote]
No, HDCP is an encryption scheme for digital signaling. Digital encryption doesn't make sense for a non-digital signal - the content would have to be decrypted BEFORE being converted back to analog for output over VGA.
[quote]You're not going to be able to find a CRT anymore that can do those insane resolutions (and still look good) and have HDCP.[/quote]
CRT monitors that can do those resolutions, and look good, are a dime a dozen (not literally). CRT monitors that do HDCP, however, do not exist.
And wizboy11, if all I wanted was a CRT that performed well at 1920x1080, I could use my old 17" that I bought for around $250 in 1996. I'm willing to bet your Westinghouse won't outlast it, either.
Originally posted by: wizboy11
But ONCE you actually use and configure your own LCD, then, just then, you might see the light.
I use a midrange LCD every day at work. I am very frustrated with the lack of desktop real estate (as in resolution).
The native resolution restrictions of most digital monitors suck ass as well. My old 17" could actually do 3000x2000 (just at a ridiculously low refresh, like 45 Hz), but it was unreadable at anything near that; max readable was around 2048x1536. I would say 3000x2000 on the old 17" CRT looked about the same as 1600x1200 does on my work LCD, since the LCD's native res is only 1280x1024.
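To put a rough number on that last comparison, here's a quick sketch of the scaling involved. The ratios are just arithmetic from the resolutions in the post, not measurements:

[code]
# Why a non-native mode looks soft on an LCD: the panel's scaler has to
# resample the image down to its fixed pixel grid, and any ratio below 1.0
# means detail is being thrown away.
def physical_px_per_source_px(src, native):
    return native / src

print(physical_px_per_source_px(1600, 1280))  # 0.80 horizontally
print(physical_px_per_source_px(1200, 1024))  # ~0.85 vertically
[/code]

So roughly a fifth of the horizontal detail never reaches the screen, which is why 1600x1200 on a 1280x1024 panel ends up about as mushy as an oversized mode on a CRT.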