Computer forgets there is an LCD hooked up

amdskip

Lifer
Jan 6, 2001
22,530
13
81
Newer AMD system running Windows 7.
GeForce 8400 GS (just updated to driver 285.62; previously on 280.26). Just your basic office desktop.

I have an LCD connected via DVI and an LG TV connected via VGA. They are set up in clone mode, which works fine, but the problem is that the computer forgets about the monitor and refuses to turn it back on after the TV is turned on. If I launch the NVIDIA Control Panel and rerun the clone setup, it remembers the LCD again. I poked around for power-saving features and I'm just not seeing anything. The video card is working, because there is video on the TV, just not on the LCD. Any ideas?
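If you want to confirm from outside the NVIDIA Control Panel whether Windows itself still sees the LCD once it goes blank, a quick diagnostic sketch is to enumerate the display devices with the Win32 API. This is just my sketch (Python via ctypes, Windows-only; the function name `list_active_displays` is mine) and returns an empty list on other platforms so it stays runnable:

```python
import ctypes
import sys

def list_active_displays():
    """Enumerate the display devices Windows currently considers attached.

    Returns a list of (device_name, device_string) tuples for devices
    flagged as active. On non-Windows platforms this returns an empty
    list so the sketch stays runnable anywhere.
    """
    if sys.platform != "win32":
        return []

    user32 = ctypes.windll.user32

    class DISPLAY_DEVICE(ctypes.Structure):
        _fields_ = [
            ("cb", ctypes.c_ulong),
            ("DeviceName", ctypes.c_wchar * 32),
            ("DeviceString", ctypes.c_wchar * 128),
            ("StateFlags", ctypes.c_ulong),
            ("DeviceID", ctypes.c_wchar * 128),
            ("DeviceKey", ctypes.c_wchar * 128),
        ]

    DISPLAY_DEVICE_ACTIVE = 0x00000001  # device is part of the desktop
    displays = []
    i = 0
    dev = DISPLAY_DEVICE()
    dev.cb = ctypes.sizeof(DISPLAY_DEVICE)
    while user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
        if dev.StateFlags & DISPLAY_DEVICE_ACTIVE:
            displays.append((dev.DeviceName, dev.DeviceString))
        i += 1
    return displays

if __name__ == "__main__":
    for name, desc in list_active_displays():
        print(name, "-", desc)
```

Running this while the LCD is blank would tell you whether the OS has actually dropped the device or whether the signal is being cut further down the chain.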
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
Is the monitor set to be the primary display? (Display #1?)
 

amdskip
Yes and no. Since they're cloned, clicking Identify displays both a 1 and a 2 on each monitor.
 

Meractik

Golden Member
Jul 8, 2003
1,752
0
0
Hello! I have a similar setup, except I'm using two TVs, one on VGA and one on HDMI, also with an NVIDIA GPU. Looking through my NVIDIA Control Panel, these are the settings I would mess with if I were you...
[attached screenshot: NVIDIA Control Panel display settings]


I also recently learned that some video cards do not support simultaneous output over VGA and DVI. But since you can get both to turn on, just not stay on, I will assume that is not the issue...
 

amdskip
Thank you, but that did not appear to fix anything for good. My LCD was not listed, but after I click Rigorous Display Detection it is listed and working without my doing anything else, even though Rigorous returns "sorry, unable to detect any older monitors...". I double-checked, and the monitor is set up to be #1.
 

Meractik
When the monitor appears after you select Rigorous Display Detection, is it checkmarked? I am not 100% sure, but I would assume the checkmark means the system acknowledges both monitors...
 

amdskip
Yes, checkmarked and working without doing anything else. Very strange. I'm not sure if it is a monitor or a video card issue; I have not swapped parts yet.
 

Meractik
I would assume it might be a hardware issue just from reading the NVIDIA FAQ for your card. It looks like it might be the issue I brought up. The wording is very precise here...

http://www.nvidia.com/object/geforce_8400_Gs_faq.html

Q: What are the key features of the GeForce 8400 GS?
A:

NVIDIA unified architecture with GigaThread™ technology
Full Microsoft DirectX 10 Shader Model 4.0 support
True 128-bit floating point high-dynamic-range (HDR) lighting
NVIDIA Quantum Effects™ physics processing technology
One single-link DVI output, supporting one 1920x1200 resolution display
NVIDIA® PureVideo™ HD technology
HDCP Capable
PCI Express® support
OpenGL® 2.1 support
NVIDIA® ForceWare® Unified Driver Architecture (UDA)
Built for Microsoft Windows Vista

It very well might support only one display output at a time, which would explain why they're cloned: either it's too much for the card, or it's not able to output an extended desktop to two monitors. To confirm this, I would try setting up extended desktop instead of clone. It might just do exactly what it says and only support one display at a time. Perhaps it's trying to push too much to the secondary display?

Hmmm... the more I read about this, the more I believe it's just that the DVI port itself, being single-link, allows only one monitor per output; no inserting a Y-connector here and running two DVI monitors (that would require dual-link DVI-D). So, since you're using VGA and DVI, the card should work with both simultaneously... I will keep researching to look for more ways to help. I understand you're cloning the monitors, but are you able to set up extended desktop? And if you do, does the LCD still lose video?
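For what it's worth, the "single-link" limit is about bandwidth per DVI port (a 165 MHz TMDS pixel clock), not about how many outputs the card can drive at once. A back-of-the-envelope check in Python (the reduced-blanking totals are my approximations, not from the FAQ):

```python
# Rough single-link DVI bandwidth check. Single-link TMDS tops out at a
# 165 MHz pixel clock, which is why the FAQ quotes "one 1920x1200 display"
# per DVI output. Blanking totals below are approximate CVT reduced-blanking
# timings (an assumption on my part).

SINGLE_LINK_LIMIT_MHZ = 165.0

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total horizontal pixels x total lines x refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# 1920x1200 @ 60 Hz, reduced blanking: ~2080 x 1235 total pixels per frame
clk_1920 = pixel_clock_mhz(2080, 1235, 60)
# 2560x1600 @ 60 Hz, reduced blanking: ~2720 x 1646 total pixels per frame
clk_2560 = pixel_clock_mhz(2720, 1646, 60)

print(clk_1920 < SINGLE_LINK_LIMIT_MHZ)  # True  (~154 MHz, fits single link)
print(clk_2560 < SINGLE_LINK_LIMIT_MHZ)  # False (~269 MHz, needs dual link)
```

So a single-link DVI port caps the resolution of the one monitor on that port, but says nothing about driving a second monitor from the separate VGA output.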

Okay, let's overturn this assumption: I believe your video card does indeed support dual monitors through VGA/DVI, per the specs listed at the Amazon URL - http://www.amazon.com/PNY-nVidia-Ge.../dp/B001D72NE0

so that rules out that possible issue.... moving on...
 

Meractik
Looking at my monitor configuration, it looks as though the two TVs I use are both able to run the same resolution, and Windows sets them to match in its own display properties. If I were you, I would go into the Windows 7 display properties and ensure both displays are set to the same resolution...