New Build: Had video signal, then lost it

100Core

Member
Mar 8, 2009
Hi All,

Appreciative of any help I can get here. Long-time builder, and I just put one together (parts below).

I was POSTing fine and was just getting the BIOS settings where I wanted them when, all of a sudden, I lost video signal after a reboot. The only setting I changed was disabling CSM, as I was preparing to set up a UEFI RAID 0. It doesn't make sense to me that changing that would cause me to lose video signal, so I don't know if it's actually related.


Any ideas on what would cause losing video after having it for over an hour? I went through the standard checklist and there don't seem to be any obvious issues. I'm going to pull most of the peripherals now and see if I can get it back.

i7-4790K @ 4.0
Gigabyte Z97X-UD5H
8GB G.Skill Sniper @ 2133
2x Sapphire R9 290
2x Samsung 840 EVO 250GB
WD Black 2TB
Rosewill 1000W

Thanks in advance for any ideas/advice!

Will
 

vailr

Diamond Member
Oct 9, 1999
Use the Gigabyte DualBIOS switch to attempt booting from the backup BIOS.
Or else: try booting with only one video card present, or even with no video card, and see if the Intel GPU is working. If the BIOS was set to have the Intel GPU as the primary boot video device, there may be a loss of signal from any discrete video card.
 

VirtualLarry

No Lifer
Aug 25, 2001
Unless your video cards have a "UEFI GOP-compliant" VBIOS, then yes: disabling CSM boot would prevent your card(s)' BIOS from initializing, which is why you got no display output.
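If you want to check a card yourself rather than guess, you can dump the VBIOS (GPU-Z can save it) and walk the PCI expansion ROM headers to see whether an EFI (GOP) image is present alongside the legacy one. A minimal sketch, assuming a dump saved as card.rom (the filename is just an example):

```python
# Sketch: scan a dumped video BIOS for an EFI/GOP image. A PCI
# expansion ROM is a chain of images; each starts with the 0x55 0xAA
# signature and carries a "PCIR" data structure whose Code Type byte
# says what kind of image it is: 0x00 = legacy x86 BIOS, 0x03 = EFI.
import struct

CODE_TYPES = {0x00: "legacy x86 BIOS", 0x03: "EFI (UEFI GOP)"}

def list_rom_images(path):
    rom = open(path, "rb").read()
    off = 0
    while off + 0x1A <= len(rom) and rom[off:off+2] == b"\x55\xAA":
        # Offset 0x18 (little-endian word) points to the PCI data structure.
        pcir = off + struct.unpack_from("<H", rom, off + 0x18)[0]
        if rom[pcir:pcir+4] != b"PCIR":
            break
        length = struct.unpack_from("<H", rom, pcir + 0x10)[0] * 512
        code_type = rom[pcir + 0x14]
        last = rom[pcir + 0x15] & 0x80  # bit 7 is set on the final image
        print(f"image at 0x{off:06X}: {CODE_TYPES.get(code_type, hex(code_type))}")
        if last or length == 0:
            break
        off += length

list_rom_images("card.rom")  # no "EFI (UEFI GOP)" line = no GOP support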
 

100Core

Member
Mar 8, 2009
Thanks, all, for the replies, and sorry for the delay; I got pulled away from this project by the wife.

I reluctantly cleared CMOS and did get video back; it was definitely the CSM setting. It's been a couple of years since my last build and I'm not very familiar with this Windows 8 / UEFI stuff, but I'm up and running with 8.1 installed and things are fairly stable.

Not that stable, though: the system shuts down completely when starting up video-heavy gaming. I'm running two R9 290s in Crossfire. I'm pretty sure 1000W from my PSU is enough, and taking heat readings up until the shutdown, the GPUs only get to 55C, so heat doesn't seem to be the issue. Unless it's the CPU? I didn't log it during the shutdown, but it seems quite low at idle (30-40C).


Questions also remain on CSM. I've confirmed that my GPUs are UEFI/GPT compatible (listed above). The mobo manual is actually what prompted me to set CSM to "Never" as the first step of setting up RAID 0 in a UEFI environment. I had to leave it on "Always", and the RAID is working fine, but I suspect I'm missing out on some faster POST times. I'm not particularly concerned about that, but I feel like I don't have those Windows 8 settings where they're supposed to be. Thoughts on this are appreciated.

Thanks again,
Will
 

inachu

Platinum Member
Aug 22, 2014
This may be a stupid question, but when you did your system build, did you install the video driver first after the OS was installed, or the motherboard drivers first?

Doing one or the other means the video driver would grab a different set of IRQs.
 

mfenn

Elite Member
Jan 17, 2010
This may be a stupid question, but when you did your system build, did you install the video driver first after the OS was installed, or the motherboard drivers first?

Doing one or the other means the video driver would grab a different set of IRQs.

There's really no such thing as a "motherboard driver" anymore, at least not in the PCI-controller sense you're referring to. Certainly there are drivers for built-in peripherals like network and audio controllers, but those are really just PCIe devices that happen to be hardwired onto the board. All of the PCI controller drivers are standard and built into the OS. The Intel INF Update utility does exactly that: it updates the INF files so that Device Manager shows the official Intel-branded names instead of generic device names. The same driver is used in either case.

With PCIe and its mandatory MSI, there is really no need to worry about IRQs anymore; they certainly play no role in device enumeration.
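As an aside, the MSI point is easy to see for yourself on a Linux box, where /proc/interrupts labels every vector with its type. A rough sketch (Linux-only; the printed counts are illustrative):

```python
# Sketch: count message-signaled vs. legacy IRQ lines on Linux by
# reading /proc/interrupts. PCIe devices using MSI/MSI-X get their own
# per-device vectors, so nothing is shared the way legacy PCI IRQs were.
from collections import Counter

kinds = Counter()
with open("/proc/interrupts") as f:
    next(f)  # skip the per-CPU header row
    for line in f:
        fields = line.split()
        if not fields or not fields[0].rstrip(":").isdigit():
            continue  # skip NMI/LOC/etc. summary rows
        # The chip/type column follows the per-CPU counts, e.g.
        # "IO-APIC 9-fasteoi" for legacy or "PCI-MSI 524288-edge" for MSI.
        kind = "MSI/MSI-X" if any("MSI" in fld for fld in fields) else "legacy/IO-APIC"
        kinds[kind] += 1

print(dict(kinds))  # e.g. {'MSI/MSI-X': 40, 'legacy/IO-APIC': 6}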

The OP's problems occur long before the OS gets to the point of loading drivers. If you can't even see the BIOS (UEFI) screen, then it's not an OS problem.
 

mfenn

Elite Member
Jan 17, 2010
Not that stable, though: the system shuts down completely when starting up video-heavy gaming. I'm running two R9 290s in Crossfire. I'm pretty sure 1000W from my PSU is enough, and taking heat readings up until the shutdown, the GPUs only get to 55C, so heat doesn't seem to be the issue. Unless it's the CPU? I didn't log it during the shutdown, but it seems quite low at idle (30-40C).

A complete shutdown under load is a classic symptom of a PSU over-current (or related) protection circuit being tripped. What exact PSU do you have?

Questions also remain on CSM. I've confirmed that my GPUs are UEFI/GPT compatible (listed above). The mobo manual is actually what prompted me to set CSM to "Never" as the first step of setting up RAID 0 in a UEFI environment. I had to leave it on "Always", and the RAID is working fine, but I suspect I'm missing out on some faster POST times. I'm not particularly concerned about that, but I feel like I don't have those Windows 8 settings where they're supposed to be. Thoughts on this are appreciated.


On CSM (Compatibility Support Module): my short answer is that UEFI boot on self-integrated hardware is a complete crapshoot, and you're going down a rabbit hole with little to no reward. RAID 0 boot will work fine in legacy mode; just leave it be.

The long answer is that just because the GPU is capable of UEFI boot doesn't mean the default GPU BIOS has it enabled, nor that the devices are enumerated in the order you expect. Adding Crossfire to the mix adds complexity. First, try each video output (including the IGP) during boot and see if any of them gives you a signal. Second, search for a UEFI-enabled GPU BIOS (sometimes these are only given out to system integrators).

As a side note, why do you want to boot from RAID 0 in the first place? A standard gaming build doesn't benefit from the increased performance at high queue depths that RAID 0 SSDs give you. The more typical low-queue-depth performance is limited by latency, which is the same or higher in a RAID setup.
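To put rough numbers on that (the latency and throughput figures below are assumptions for an 840 EVO-class SATA drive, not measurements):

```python
# Back-of-envelope: why striping two SSDs barely moves everyday I/O.
qd1_read_latency_us = 100   # ~0.1 ms per 4 KB random read at QD1 (assumed)
seq_single_mb_s = 500       # one drive, sequential (assumed)
seq_raid0_mb_s = 1000       # two drives striped, sequential (assumed)

# At QD1 each request waits for the previous one, so throughput is
# bounded by latency, which RAID 0 does not reduce:
qd1_iops = 1_000_000 / qd1_read_latency_us
print(f"QD1 4K IOPS, single or RAID 0: ~{qd1_iops:,.0f}")  # ~10,000 either way

# Sequential copies do scale, but only save time on big transfers:
gb = 10
print(f"{gb} GB copy: {gb*1024/seq_single_mb_s:.0f}s single, "
      f"{gb*1024/seq_raid0_mb_s:.0f}s RAID 0")  # ~20s vs ~10s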
 

100Core

Member
Mar 8, 2009
This is the kind of solid advice I come back to AT for. Thank you.

I was afraid it was the PSU, which admittedly is the only non-new component in this build. Although I hadn't had any problems with it previously, it's about five years old, so I ordered a new one (Cooler Master G2 Silent 1000W Silver, 50% off at $99 AR on Newegg; good enough).

I will take your advice on UEFI and just leave it. I think it's actually set up now; I was just wondering how to optimize, but it's a really fast boot anyway.

I'm running RAID 0 because gaming is only half the story for this rig. It will be the main office computer too, and my creative hub. Plus, I've always run RAIDs; really quick reads and writes give one of the most noticeable jumps in zippiness for all sorts of things, in my experience. The jump from 500 MB/s to 1 GB/s is probably beyond what I can detect, but I need the extra storage space anyway, and two drives seem like the better choice over one bigger SSD.

Thanks again, definitely going with the suggestions.

Will

P.S. I need to update my really old signature.
 

mfenn

Elite Member
Jan 17, 2010
I was afraid it was the PSU, which admittedly is the only non-new component in this build. Although I hadn't had any problems with it previously, it's about five years old, so I ordered a new one (Cooler Master G2 Silent 1000W Silver, 50% off at $99 AR on Newegg; good enough).

The reason I asked about the model was so that I could look up the rail configuration. It might simply be that you are overloading one of the 12V rails.

Anyway, the Cooler Master Silent Pro M2 (I saw no Silver-rated Cooler Master "G2") is a single-rail design, so you won't have that problem.
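For anyone who wants to sanity-check the rail math themselves, a back-of-envelope sketch (the wattages are assumed typical figures, not measurements):

```python
# Rough 12 V budget for this build; all wattages are assumptions.
r9_290_w = 275     # per card, stock, under gaming load (assumed)
i7_4790k_w = 90    # stock Haswell under load (assumed)
fans_etc_w = 30    # fans, pumps, drives on 12 V (assumed)

load_w = 2 * r9_290_w + i7_4790k_w + fans_etc_w
print(f"~{load_w} W on 12 V, i.e. ~{load_w/12:.0f} A")  # ~670 W / ~56 A

# A single-rail 1000 W unit handles that easily, but a multi-rail
# design with, say, 25 A (300 W) per rail can trip OCP if both PCIe
# cables for the cards hang off the same rail:
per_rail_a = 25
print(f"two cards on one {per_rail_a} A rail: {2*r9_290_w/12:.0f} A -> OCP trip")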

I'm running RAID 0 because gaming is only half the story for this rig. It will be the main office computer too, and my creative hub. Plus, I've always run RAIDs; really quick reads and writes give one of the most noticeable jumps in zippiness for all sorts of things, in my experience. The jump from 500 MB/s to 1 GB/s is probably beyond what I can detect, but I need the extra storage space anyway, and two drives seem like the better choice over one bigger SSD.

Like I said, RAID0 SSDs don't help with performance at low QD because latency is the primary factor. Unless you're doing something really unusual, I don't see you pushing high queue depths.

All you're really buying with RAID 0 SSDs is reduced reliability and higher latency (extra CPU cycles for the motherboard's FakeRAID). Since you've already bought two drives, my advice would be to run standard AHCI and segregate the workload at the application level.
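The reliability half of that is simple probability: a striped pair dies if either drive dies. A quick illustration (the per-drive failure rate is an assumed, illustrative figure):

```python
# RAID 0 loses the whole volume if either member fails. With an
# assumed ~1.5% annual failure rate per SSD:
p = 0.015
raid0_afr = 1 - (1 - p) ** 2
print(f"single drive: {p:.1%}/yr, RAID 0 pair: {raid0_afr:.2%}/yr")
# -> roughly double the chance of losing everything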
 

100Core

Member
Mar 8, 2009
So after installing the new 1000W PSU (indeed the CM Silent Pro M2), I'm still getting instant shutdowns when gaming: sometimes right when the game launches, and sometimes 1-10 minutes in.

The shutdowns are much less frequent when I turn off Crossfire on the R9 290s, so that's how I'm running it mostly, but it still happens occasionally. It also seemed to help to take the case panels off, so heat made it worse, but with both GPUs running it still fails every time, even with the case completely open. If this is a heat issue, how can I fix it? Is it a bad GPU?

Just a quick side note that this system has never been overclocked in any way. Thanks in advance for any help here.

P.S. Ignore the outdated signature; I'm not sure how to change it on mobile.
 

richaron

Golden Member
Mar 27, 2012
From most of your symptoms I would have guessed a thermal thing or a PSU thing.

But due to my recent experience with a Gigabyte motherboard (and ongoing cynicism), I'm gonna put my two cents on RAM instability due to a Vdroop-type situation. I've had to overvolt my RAM a little.
 

mfenn

Elite Member
Jan 17, 2010
So after installing the new 1000W PSU (indeed the CM Silent Pro M2), I'm still getting instant shutdowns when gaming: sometimes right when the game launches, and sometimes 1-10 minutes in.

The shutdowns are much less frequent when I turn off Crossfire on the R9 290s, so that's how I'm running it mostly, but it still happens occasionally. It also seemed to help to take the case panels off, so heat made it worse, but with both GPUs running it still fails every time, even with the case completely open. If this is a heat issue, how can I fix it? Is it a bad GPU?

Just a quick side note that this system has never been overclocked in any way. Thanks in advance for any help here.

P.S. Ignore the outdated signature; I'm not sure how to change it on mobile.

Try physically pulling one of the cards out instead of disabling Crossfire in software. If it's stable, try swapping in the other card. If they're both stable in single-card configurations, then you can guess it's heat-related. If you can isolate the failures to one card, then you'll know you have a bad card.
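If you want a temperature trail that survives the cutoff, log to disk and flush every sample so the last reading is preserved when the power drops. A minimal sketch; note that psutil's sensor call is Linux-only, so on this Windows build GPU-Z or HWiNFO logging is the practical equivalent, but the flush-per-sample idea carries over:

```python
# Sketch: log temperatures once a second, flushing every sample so the
# last reading survives a hard power-off. psutil.sensors_temperatures()
# works on Linux; sensor/label names vary by machine.
import os, time
import psutil  # pip install psutil

with open("templog.csv", "a") as log:
    while True:
        readings = []
        for chip, entries in psutil.sensors_temperatures().items():
            for e in entries:
                readings.append(f"{chip}/{e.label or '?'}={e.current:.0f}C")
        log.write(f"{time.strftime('%H:%M:%S')},{';'.join(readings)}\n")
        log.flush()
        os.fsync(log.fileno())  # force the line to disk before a possible cutoff
        time.sleep(1)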