New Large Monitor


natto fire

Diamond Member
Jan 4, 2000
Originally posted by: SonicIce
Originally posted by: cscpianoman
DVI is better. DVI is a digital interface designed for LCDs, which are digital. CRTs are analog, so you won't find any CRTs that have DVI ports.

think again

It should be noted, however, that the signal will still be converted to analog before being displayed. A true DVI interface is pixel-for-pixel the exact image the video card is rendering. Whether the monitor or a decent video card has better D/A circuitry is another debate altogether.
 

YOyoYOhowsDAjello

Moderator, A/V & Home Theater, Elite member
Aug 6, 2001
Do CRTs with DVI connections use the digital or analog portion of the DVI signal?
 

Ken90630

Golden Member
Mar 6, 2004
Originally posted by: YOyoYOhowsDAjello
Do CRTs with DVI connections use the digital or analog portion of the DVI signal?

I believe a CRT with a DVI connector would only be compatible with an analog DVI-I signal.

As you prolly know, there are two types of DVI connectors: DVI-I (the connector can carry either digital or analog signals) and DVI-D (the connector can only carry a digital signal). I think that if a CRT used a digital signal input, it would then have to be converted to analog in order to be displayed (and a digital-to-analog converter would add to the cost). It seems to me this would be pointless, as it would be simpler and cheaper just to use an analog input on the CRT. :)
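To make the connector distinction above concrete, here's a minimal Python sketch of which signal types each variant carries (note I've also included DVI-A, a rarer analog-only variant not mentioned above):

```python
# Which signal types each DVI connector variant carries.
# (DVI-A, an analog-only variant, also exists but is rare.)
DVI_SIGNALS = {
    "DVI-I": {"digital", "analog"},  # integrated: carries both
    "DVI-D": {"digital"},            # digital only
    "DVI-A": {"analog"},             # analog only
}

def crt_compatible(connector: str) -> bool:
    """A plain CRT (no internal DAC) needs the analog portion of the signal."""
    return "analog" in DVI_SIGNALS[connector]

print([c for c in DVI_SIGNALS if crt_compatible(c)])  # ['DVI-I', 'DVI-A']
```

So a CRT with a DVI plug would only work on a DVI-I (or DVI-A) output, exactly as described above.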

Having said that, I've been embarrassed a few times by assuming logical thought processes among mfgrs when it comes to computer products, so the above info is stated with, let's say, only 99% confidence. (For all I know, maybe Thermaltake makes a fluorescent lime green CRT that takes a DVI-D input and uses a D-to-A converter inside. :laugh: )

Just a joke, Thermaltake fanboys. Put your flamethrowers away. :D
 

SonicIce

Diamond Member
Apr 12, 2004
What makes an LCD "digital"? Why are CRTs considered "analog"? How can a device or peripheral be considered "analog" or "digital"? Why aren't there CRTs with a real digital input? I thought analog or digital only referred to the way data is transmitted from one device to another, not to the device itself.
 

SonicIce

Diamond Member
Apr 12, 2004
Originally posted by: YOyoYOhowsDAjello
If I was getting a new monitor I'd probably go with an NEC 1250+ since I really like my 950+.

Man, I'm still on the fence between that and the P260 :p. Gonna buy real soon. I'm leaning towards the NEC only because I saw a Tom's Hardware review that said it was great, and a Sony G520 in it did very poorly, and I think the P260 uses the G520's tube. This is the only review I've seen that sort of has both in it. The P260 does have an advantage in specs, though, with 1920x1440 at 75Hz instead of the NEC's 73Hz, and the P260 has dual inputs with a neat front-mounted switch.
 

Goi

Diamond Member
Oct 10, 1999
LCDs are "digital" because the signal they take is a digital one. The individual subpixels on an LCD are fixed and have only a limited number of discrete levels (2^6 to 2^8, i.e. 64 to 256, depending on the color bit depth of the panel). If you feed the LCD panel an analog signal via your regular HD-15 (VGA) cable, an internal ADC has to convert it back to digital. Hence, they are considered "digital" devices.

CRTs are "analog" because the signal they take is an analog one. There are no fixed pixels. Instead, phosphors on the CRT face glow when the electron gun strikes them, and the resultant "pixel" has no predefined discrete state but is continuous (you never hear of a CRT having a 16/24/32/48-bit color reproduction range because it is, for all intents and purposes, infinite); hence they are "analog".
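To put numbers on those discrete subpixel states, here's a quick sketch (assuming the common 6-bit and 8-bit panel depths):

```python
# Total colors an LCD can show: each of the 3 subpixels (R, G, B)
# has 2**bits discrete levels, so colors = (2**bits) ** 3.
def lcd_colors(bits_per_subpixel: int) -> int:
    levels = 2 ** bits_per_subpixel  # discrete states per subpixel
    return levels ** 3               # combinations across R, G, B

print(lcd_colors(6))  # 6-bit panel: 262,144 colors
print(lcd_colors(8))  # 8-bit panel: 16,777,216 colors ("24-bit color")
```

A CRT has no such hard ceiling, which is exactly the "analog" point being made above.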
 

Goi

Diamond Member
Oct 10, 1999
I've seen quite a few FD Trinitrons in my day, and I haven't come across any that are bad. They are some of the best CRT tubes around, along with the Mitsubishi Diamondtron NF that NEC uses. I don't think you'd be disappointed in any FD Trinitron unless it's defective in the first place, which isn't that uncommon with refurbished monitors. FYI, like I mentioned, I had to go through two Dell P1130 monitors (with the 21" FD Trinitron tube) because the first one had a cracked chassis. The image on the first one was extremely good and sharp. I got my second one after a free RMA, and it came without any major physical damage, but the image was clearly inferior to the first one. It's still good, just clearly not as good.

Anyway, go with the one that supports a higher resolution/refresh rate. Dual inputs are cool if you need them, but otherwise useless.
 

Ken90630

Golden Member
Mar 6, 2004
Originally posted by: SonicIce
What makes an LCD "digital"? Why are CRTs considered "analog"? How can a device or peripheral be considered "analog" or "digital"? Why aren't there CRTs with a real digital input? I thought analog or digital only referred to the way data is transmitted from one device to another, not to the device itself.

It's late and I'm way too tired for a properly thorough explanation here, but I'll jot down a quick reply:

As I understand it, "digital" devices process signals/data in the form of 1s and 0s. In the case of digital sound, for example, analog waveforms are (at some point in the sound's production) sampled thousands of times a second and the results are converted into 1s and 0s. On a typical commercial CD, for instance, the original analog waveform has been sampled 44,100 times a second to create the unique pattern of 1s and 0s that represents that waveform and the sound it created. Digital data is processed similarly: 1s and 0s (or "on and off" signals, as they're sometimes called) that collectively represent whatever is being created or processed.
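The sampling idea described above can be sketched in a few lines of Python; this is a simplified illustration using CD-audio parameters (44.1 kHz, 16-bit), with a 440 Hz test tone as an arbitrary example:

```python
import math

SAMPLE_RATE = 44100  # CD audio: samples per second
BITS = 16            # CD audio: bits per sample

def sample_sine(freq_hz: float, duration_s: float):
    """Sample an 'analog' sine wave into a list of 16-bit signed integers."""
    max_amp = 2 ** (BITS - 1) - 1  # 32767, the largest 16-bit signed value
    n = int(SAMPLE_RATE * duration_s)
    return [round(max_amp * math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE))
            for i in range(n)]

samples = sample_sine(440.0, 0.01)  # 10 ms of an A4 tone
print(len(samples))                 # 441 samples
```

Each number in `samples` is one of those "1s and 0s" snapshots of the continuous waveform.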

"Analog" devices operate via signal waveforms (or, in the case of music, like with a piano, acoustic waveforms). There is no conversion of the waveform to 1s and 0s as there is with digital media. An analog CRT monitor receives an electrical waveform signal instead of a stream of 1s and 0s; the phosphor elements in a CRT react to an analog waveform, not a digital stream. This is why I said earlier that if an analog CRT were, for some reason, to receive a digital signal, it would have to convert it to analog for the display to work. Conversely, an LCD monitor can receive an analog signal, but if it does, it converts it into a digital signal so that the pixels will light up. The best way to feed an LCD monitor is with a digital signal.

This is a somewhat crude and abbreviated explanation, but covers the basics.

Originally posted by: SonicIce
I thought analog or digital was only referring to the way the device transmits data from one device to another, not the device itself.
Nope. An iPod, for example, is a digital device. LCD displays are digital devices (and use an analog-to-digital converter when fed an analog signal). CRTs are analog devices. Hard drives and DVD players are digital devices (data is encoded digitally, as 1s and 0s, as opposed to waveforms). An electric guitar and a piano are analog devices. And so on ....

Hope this helps a little. Time for some sleep. :moon:
 

Jeff7

Lifer
Jan 4, 2001
All I can say about this is that I've got a Nokia 21" monitor, a 445Pro, and I love the quality - and Viewsonic purchased Nokia's monitor division, so I'd hope that some of that technical skill was transferred over.

The difference I see between the two monitors linked (the second link is broken; you included the parentheses in the link) is that the G220f has a flat screen, and that the latter weighs a good bit more.
Both use a shadow mask. Both support the same maximum refresh rate, 180Hz, though this is only available at 640x480.

I like the G220f better, based on the specsheets.
- Flat screen - definitely better to have
- Supports higher refresh rates at every resolution, especially 1280x1024 and 1600x1200. You'll want high refresh rates at these resolutions
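The reason refresh rate drops as resolution climbs is that the CRT's horizontal scan frequency is the bottleneck. Here's a rough back-of-the-envelope sketch; the ~5% vertical-blanking overhead is my own simplifying assumption (real timings vary by mode):

```python
def horizontal_scan_khz(v_res: int, refresh_hz: float, blanking: float = 0.05) -> float:
    """Approximate horizontal scan frequency a CRT needs, in kHz.
    Assumes ~5% of each frame is vertical blanking (rough rule of thumb)."""
    return v_res * refresh_hz * (1 + blanking) / 1000

print(round(horizontal_scan_khz(1200, 85), 1))  # ~107.1 kHz for 1600x1200 @ 85Hz
print(round(horizontal_scan_khz(480, 180), 1))  # ~90.7 kHz for 640x480 @ 180Hz
```

A monitor's spec sheet lists a maximum horizontal frequency; any resolution/refresh combination that needs more than that is out of reach, which is why 180Hz is only available at 640x480.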



LCDs and other flat-panel technologies (OLED or whatever may come out): I don't think I'll be buying those until they figure out some way of running multiple resolutions without the inherent, awful interpolation artifacts we all know and love.

Originally posted by: NightShadeKW
I saw it for sale here: http://www.azatek.com/details.asp?iid=397 for $85, though I have no idea how reliable that place is. Is "Surplus Computers" reliable?

Anyhow, I take it the P260 monitor would work well for gaming purposes?
Regarding that, it is rated "Grade B." Here's how the site defines that:

Grade B - Monitor is in great condition (most vendors would sell these as "A Grade").
Has slight cosmetic blemishes, examples of which are (but not limited to):
Slight Scratches in case
Slight Scratches in the screen
Hairline Crack in Case/Base
Slight Screen Burn
Slight imperfections in the AGAS (Anti-Glare, Anti-Static) Coating etc.
Imperfections on the case are fine. Anything wrong with the screen will drive you nuts.


There is a lot less eye strain with LCDs because there is no flicker as there is with a CRT. They also use less power. This may not make a huge difference with just one LCD, but it still uses less power. Also less distortion, less heat, and less radiation.
True, but if you have a monitor with a high refresh rate, you won't notice flicker. I even know people who can't tell the difference between 60Hz and 85Hz. :shocked:
I see flicker below 85Hz. I run my monitor at 1024x768 @ 120Hz, and 1280x1024 @ 85Hz for games. I can sit at the thing for hours without getting eyestrain. Unfortunately, I don't have the luxury of lots of time to do that anymore, but still, my point remains valid. :)
Yes, LCDs do use less power, which does irritate me, since I still can't stand LCD interpolation at non-native resolutions.
Less distortion - depends on your monitor. I've seen crappy CRTs, blurry and distorted, and I've seen excellent ones, like what I've got now - sharp, clear, and an excellent picture overall.
Less heat - goes with less power.
Radiation - also true, though it seems the jury's still out on the harmful effects of electromagnetic fields. EM radiation, sure, we get plenty of that all the time; radio waves and visible light are both examples of EM radiation. :)
 

YOyoYOhowsDAjello

Moderator, A/V & Home Theater, Elite member
Aug 6, 2001
Originally posted by: Ken90630
Originally posted by: YOyoYOhowsDAjello
Do CRTs with DVI connections use the digital or analog portion of the DVI signal?

I believe a CRT with a DVI connector would only be compatible with an analog DVI-I signal.

As you prolly know, there are two types of DVI connectors: DVI-I (the connector can carry either digital or analog signals) and DVI-D (the connector can only carry a digital signal). I think that if a CRT used a digital signal input, it would then have to be converted to analog in order to be displayed (and a digital-to-analog converter would add to the cost). It seems to me this would be pointless, as it would be simpler and cheaper just to use an analog input on the CRT. :)

Having said that, I've been embarrassed a few times by assuming logical thought processes among mfgrs when it comes to computer products, so the above info is stated with, let's say, only 99% confidence. (For all I know, maybe Thermaltake makes a fluorescent lime green CRT that takes a DVI-D input and uses a D-to-A converter inside. :laugh: )

Just a joke, Thermaltake fanboys. Put your flamethrowers away. :D

OK, I assumed it was using the analog part of the signal, but the statements above made it sound like it was using the digital portion, so I wanted to hear from somebody who really knew.
 

biostud

Lifer
Feb 27, 2003
Originally posted by: KoolDrew
There are many more advantages to an LCD other than the space saved ;)

But there are many benefits to CRTs too:

no native resolution
better viewing angles
better overall color reproduction

 

Tiamat

Lifer
Nov 25, 2003
Viewsonic CRT monitors suck. My roommate has had to cough up an additional $200 in shipping charges. The CRT would shut off by itself and not turn on for two weeks. Weirdest and most annoying problem. The replacements (two of them) did the same thing!

LCDs provide less eye strain and a better contrast ratio.
 

ROJAS

Senior member
Oct 9, 1999
I have been using a Samsung 213T LCD, 21.3 inch, for the last two years, and it's good for gaming as well. Not a low refresh rate like the new LCDs, but very viewable.
 

MDE

Lifer
Jul 17, 2003
Originally posted by: cscpianoman
DVI is better. DVI is a digital interface designed for LCDs, which are digital. CRTs are analog, so you won't find any CRTs that have DVI ports.

My 21" IBM P260 CRT has a DVI port.

EDIT: Crap, I should read the thread before I post...
 

Yuriman

Diamond Member
Jun 25, 2004
Originally posted by: SonicIce
Originally posted by: dguy6789
Can't beat 2048x1536 at 75Hz, and all lower resolutions at much higher refresh rates.

http://www.azatek.com/details.asp?iid=510


/Thread.

$340 shipped, when you can get a P260 for $200 shipped that has slightly less resolution. The thread isn't over yet.

YOU IDIOT!!!! P275 = Best

OK, I'm done with that. But there are quite a few key differences between the P260 and P275. The main one is their power usage: P275 = 145W, P260 = 160W. I am not going to waste time trying to explain the others to you.
 

Crescent13

Diamond Member
Jan 12, 2005
I compared a 16" Diamondtron CRT to my SyncMaster 730B digital LCD. I had them both hooked up to the same computer and played Half-Life 2 on both of them. Here is what I think performs best for different things.

motion blur: CRT
contrast: CRT
clarity: LCD
color reproduction/vividness: LCD
size: LCD
digital interface for new graphic cards: LCD
glare: LCD
modern look: LCD
easy on the eyes: LCD

That's just what I think.
 

MDE

Lifer
Jul 17, 2003
Originally posted by: Crescent13
I compared a 16" Diamondtron CRT to my SyncMaster 730B digital LCD. I had them both hooked up to the same computer and played Half-Life 2 on both of them. Here is what I think performs best for different things.

motion blur: CRT
contrast: CRT
clarity: LCD
color reproduction/vividness: LCD
size: LCD
digital interface for new graphic cards: LCD
glare: LCD
modern look: LCD
easy on the eyes: LCD

That's just what I think.
You can easily get a big CRT or LCD, unless you're referring to LCDs being thinner than CRTs.
DVI doesn't matter, as you can easily use a VGA adapter on new cards to drive a CRT.
Glare is a non-issue on my CRTs because I don't have them set up right across from a window, and I have flat-screen CRTs.
Looks aren't everything either...

 

NightShadeKW

Member
Jun 8, 2005
Okay, update. I got the Grade-A P260 from azatek for $200. When I first got it, I had a lot of trouble hooking it up (it didn't get a signal from my PC for some reason). I was just about to return it when I tried it on another computer and it worked. So I tried it again on my PC and it didn't work. I tried again, and it did, and it's been working since. I've been using it for a couple of days, enjoying it, and all of a sudden the screen goes blurry/out of focus. I don't know what this means, but I turn the monitor off and back on and it's fixed. Over the next 30 minutes or so, it goes blurry some 5 more times, and I have to keep turning it on and off, sometimes several times, before it's fixed. It's really blurry right now, in fact. Is this a defective monitor that I should send back? Or is this some sort of weird fluke? What could be causing this?
 

uberowo

Member
Jul 5, 2005
NightShadeKW: do you have any speakers or other devices with magnets in them close to your CRT monitor? :D If so, that may very well cause this. The monitor probably degausses automatically when you restart it, hence the blurring goes away. Might wanna move the speakers away from the monitor if that's the case. ;)
 

NightShadeKW

Member
Jun 8, 2005
Does a subwoofer count as a speaker? If so, that very well might be the problem, since my monitor is on top of it. My regular speakers are right next to the monitor, but they aren't on right now (during the problem), and I would have thought the problem would have shown up over the past few days if that were the cause. Also, I tried degaussing the monitor manually several times, and it didn't do anything.
 

uberowo

Member
Jul 5, 2005
My guess is that it does count, yes. And I don't think it matters whether the speakers are on. I'm not really sure of the specifics, but regular speakers and CRTs do not work together. Speakers made for use with computers are designed in a special way to avoid this exact problem, shielding their surroundings from the magnetic crap. Not sure of the technical terms in English, I'm afraid, though I'm sure someone else here is. :)

EDIT: Try moving the speakers away and do a manual degauss, and you'll see if that's the problem.
 

uberowo

Member
Jul 5, 2005
Oh and the reason I checked out this thread in the first place:

Does anyone know of any good 21" CRT options that are NOT a Trinitron or the other aperture-grille type with the annoying black horizontal lines across the entire screen?