My media computer was using a GeForce 7950GT to help with video playback. It's a great video card, but it's summer right now and my computer room gets incredibly hot, so I decided to pull the card and just use the integrated Radeon 2100 graphics. As soon as I did, the system ran like shit: videos dropped frames, and the computer would freeze for a few seconds every couple of minutes. It wasn't just a local problem, either; Remote Desktop sessions would also freeze, which doesn't seem to make any sense since Remote Desktop shouldn't even use the video card (or at least I don't think it does). Another weird problem was that Remote Desktop's colors were way off, with everything tinted slightly blue.
I put the GeForce 7950GT back in, and everything worked fine again. Does this sound like a general problem with integrated vs. discrete graphics, or does it sound like this particular integrated video is simply broken? I ran the OCCT video test on the Radeon 2100 and it didn't detect any video errors.