New monitor (3 months) - possible failure?

TheVrolok

Lifer
Dec 11, 2000
I've always left my computer on at all times, turning only my monitor off and on when I want to use it. Recently, however, in an effort to save on electricity bills, I've been shutting my computer off at night and booting it up when I use it during the day. Two days ago, when I was booting it up in the morning, I noticed I wasn't getting any video signal. I rebooted a couple of times: nothing. I turned my monitor off and on a few times, and eventually it just worked. I didn't think much of it.

Then today I turned my computer on, turned my monitor on, and got no signal. I've tried rebooting, turning my monitor off and on, and still nada. I tried a different DVI cable; nothing. All I get is the "no signal" warning on the screen. How do I know my graphics card is working? Because I'm currently using my 2nd monitor (a 42" 1080p HDTV) and it's working fine.

I did notice, though, that as soon as I plugged the TV in (DVI-HDMI) the desktop was running at 1280x720, and I'm not quite sure why. My 22" monitor runs at 1680x1050, so shouldn't it have booted in that resolution regardless of whether the monitor was working? That's what I shut it down in. Just because I plugged in a different monitor, it wouldn't automatically change - would it? It's not behavior I've noticed before. Aside from that oddity, I would have simply assumed my 22" monitor died, but now I'm not completely sure, since it showed no symptoms at all until that little quirk two days ago and now this.

Just looking for some opinions. I'm probably going to try to borrow a VGA-to-VGA cable from someone today so I can run video from my laptop to the monitor and see whether it's the DVI port on the monitor that's at fault.
 

TheVrolok

Lifer
Dec 11, 2000
Originally posted by: robisbell
what's the room temp before you start up the pc?

I have no direct way of measuring, but I'd say somewhere around 72-73F at most. Why would that matter?
 

robisbell

Banned
Oct 27, 2007
Okay, this is an LCD monitor, right?

You tried reseating the video card, and if it has a power connector, you made sure it's plugged in, right?
 

TheVrolok

Lifer
Dec 11, 2000
Originally posted by: robisbell
Okay, this is an LCD monitor, right?

You tried reseating the video card, and if it has a power connector, you made sure it's plugged in, right?

It is indeed an LCD. To be honest, no, I didn't try reseating the card or checking its power, because I've been using it for a few hours with my TV as the monitor without any problems, so I assumed the card was fine. I did, however, remember that I had a VGA cable in my laptop bag that I use with my laptop and other displays. So I plugged my laptop into the D-sub input on my 22" monitor, and it works fine. So it seems the problem is either the DVI input on the monitor or something in its digital circuitry rather than the whole monitor (since the analog D-sub works). Long story short, bummer.

I do have a ViewSonic 20" widescreen at home that I gave to my father when my 22" came in (got it cheap, an off-brand Chi Mei). It seems that going cheap this time was a bad move. I'm just going to take the 22" home and give it to him (since he only uses analog) and go back to my old 20", which, to be honest, was a better monitor in every way but the extra two inches. Now I just have to use my TV as a monitor until Thanksgiving. :p
 

robisbell

Banned
Oct 27, 2007
That tells me the card's the issue if your laptop runs the monitor fine. I wouldn't swap it out; the monitor is not the issue.
 

TheVrolok

Lifer
Dec 11, 2000
Originally posted by: robisbell
That tells me the card's the issue if your laptop runs the monitor fine. I wouldn't swap it out; the monitor is not the issue.

The monitor runs fine with my laptop as an analog source via D-SUB.
The monitor returns no signal when my desktop is the digital source via DVI.
My TV runs fine with my desktop as a signal via DVI (the same port used for my monitor).

I feel this indicates it ISN'T the card and that it is in fact the DVI port on the monitor. Is there any way the card could run the TV fine and not the monitor? They're essentially the same thing: both LCD displays with DVI in. I'll add that I'm not completely ruling out some quirk of the card, because at one point 2-3 years ago it did overheat (the fan on the GeForce 6800 got clogged with dust) and there was some artifacting until I removed the fan shroud, cleaned it, and put it back. I've watched the temps ever since and never noticed a problem (a max of maybe 71C during heavy load for hours, idling around 55C usually). I just don't understand how it could be the card if the TV works fine.
 

robisbell

Banned
Oct 27, 2007
"My TV runs fine with my desktop as a signal via DVI (the same port used for my monitor)."

I must have missed that, good point. What resolution and refresh rate did you run with the TV?
Did you try connecting a non-PC DVI source to the monitor? I had one that for some reason locked itself into a setting below what my card could do, and I had to connect a non-PC DVI source to get it to reset.
 

TheVrolok

Lifer
Dec 11, 2000
Originally posted by: robisbell
"My TV runs fine with my desktop as a signal via DVI (the same port used for my monitor)."

I must have missed that, good point. What resolution and refresh rate did you run with the TV?
Did you try connecting a non-PC DVI source to the monitor? I had one that for some reason locked itself into a setting below what my card could do, and I had to connect a non-PC DVI source to get it to reset.

I've actually been wondering if something like that may have happened. After I rebooted my computer two or three times without getting a signal on my monitor, I attached my PC's DVI output to my TV (1080p/60) without changing anything else. As soon as I turned the TV on, it was running at 1280x720, which the TV handled fine as 720p/60. I then changed the resolution to 1920x1080, which is my TV's native 1080p/60. The problem is that I now can't change the resolution to 1680x1050 (my monitor's native resolution) because the system knows the TV can't handle it. I don't know why my computer would have defaulted to 1280x720 when the TV was plugged in; I didn't think plugging in a different device would change the resolution on its own. I'm wondering if there's just a resolution setting that my computer and monitor don't agree on. However, when I plugged in the D-sub from my laptop, it ran 1680x1050 without a problem.
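
From what I've been reading, the driver rebuilds its list of valid modes from whatever EDID the attached display reports, so plugging in the TV really could change the default resolution on its own and drop 1680x1050 from the list. Here's a rough sketch of what I mean (assuming Windows and the plain Win32 API; this is just a hypothetical test program, nothing specific to my card) that dumps every mode the OS currently thinks the attached display accepts:

#include <windows.h>
#include <stdio.h>

/* link with user32.lib */

int main(void)
{
    DEVMODE dm;
    DWORD i;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* The mode list comes from the attached display's EDID, so it
       changes when a different monitor or TV is plugged in. */
    for (i = 0; EnumDisplaySettings(NULL, i, &dm); i++) {
        printf("%lu x %lu @ %lu Hz, %lu bpp\n",
               dm.dmPelsWidth, dm.dmPelsHeight,
               dm.dmDisplayFrequency, dm.dmBitsPerPel);
    }
    return 0;
}

In theory, running that once with the TV attached and once with the 22" attached over DVI (if it ever syncs again) would show whether the card is even receiving sane EDID from the monitor's DVI input.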
 

robisbell

Banned
Oct 27, 2007
What's the lowest resolution and refresh rate the monitor will take?
What's the lowest resolution and refresh rate the video card can output?
 

TheVrolok

Lifer
Dec 11, 2000
Originally posted by: robisbell
What's the lowest resolution and refresh rate the monitor will take?
What's the lowest resolution and refresh rate the video card can output?

I'll have to look it up; I'll have time on Friday after work, once my exams are over.
 

robisbell

Banned
Oct 27, 2007
Okay. I just get the feeling that's the issue. I seem to remember that the DVI and VGA ports go through the same chip. I just think the monitor and video card are on different settings for some reason.
 

TheVrolok

Lifer
Dec 11, 2000
Using my TV as a monitor, I set the resolution to 1600x900, a resolution I know both my card and my monitor can handle (though the monitor would stretch it with WS auto-adjust disabled). I still get no signal from the monitor. Even if Windows were set to a mode the monitor can't display, the boot screen should still come up, since it runs at something like 640x480, which I know the monitor can display (I've seen it plenty of times). I've also plugged my laptop into the D-sub on the monitor to get into the OSD and restore the monitor's factory settings. Still nothing from my desktop via DVI.

It seems like swapping monitors is really my only option. I don't know what else to try. I'm trying to think of a friend who has a DVI-capable graphics card so I could test the monitor on his computer, but I'm not sure how many gamer friends I have here at school. I'm pretty sure the DVI port on my monitor is simply dead. I'd rather just downgrade in size and dig out my old ViewSonic 20" than spend $300 on a new 22" widescreen (Samsung).
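
Before I give up on the DVI input entirely, one more thing I might try: forcing the desktop down to a mode that even a half-dead monitor should sync to. A rough sketch along the same lines as before (again assuming Windows/Win32; the pause is just so I have time to look at the screen before it reverts):

#include <windows.h>
#include <stdio.h>

/* link with user32.lib */

int main(void)
{
    DEVMODE dm;
    LONG result;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);
    dm.dmPelsWidth = 640;
    dm.dmPelsHeight = 480;
    dm.dmDisplayFrequency = 60;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    /* CDS_FULLSCREEN makes the change temporary. */
    result = ChangeDisplaySettings(&dm, CDS_FULLSCREEN);
    if (result == DISP_CHANGE_SUCCESSFUL) {
        printf("640x480 @ 60 Hz applied; does the monitor sync now?\n");
        Sleep(10000);                    /* ten seconds to check the screen */
        ChangeDisplaySettings(NULL, 0);  /* restore the saved mode */
    } else {
        printf("driver rejected the mode (code %ld)\n", result);
    }
    return 0;
}

If the monitor still says "no signal" at 640x480 over DVI while the TV syncs on the same port and cable, that would pretty much confirm it's the monitor's DVI input and not a settings mismatch.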
 

robisbell

Banned
Oct 27, 2007
I'd RMA it; heck, I'd take it back to wherever you purchased it and swap it for a comparably sized one.