I'm trying to understand the power requirements for a specific AGP video card, to see whether my existing power supply might be too weak to run it. There are two cards I'm interested in: a GeForce FX 5500 and a GeForce FX 6200. I'm not so interested in the recommended wattage for a power supply unit. When I look at the specs for these two cards, this is what's recommended:
"Minimum recommended power supply with +12 Volt current rating of 18 Amp."
The specs for the existing power supply in my computer indicate the following:
+3.3V@15A
+5V@11A
+12V@5A
+12.8V@7.5A
-12V@0.15A
+5VSB@3A
If my power supply doesn't provide at least 18 amps on the +12 V rail, does that mean the video card won't work at all? What is the real significance of the recommended amperage?
I'm concerned for a few reasons, but mostly because I have a proprietary power supply unit that will be very tough to replace.
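To frame the question, here's the arithmetic I'm working from (a rough sketch; I'm assuming the two 12 V-ish rails could be combined, which may not be true for this PSU, and that the 18 A figure refers to total system +12 V draw, not the card alone):

```python
# Convert each rail rating to watts (P = V * I).
# Rail data copied from the PSU label quoted above.
rails = {
    "+3.3V": (3.3, 15.0),
    "+5V": (5.0, 11.0),
    "+12V": (12.0, 5.0),
    "+12.8V": (12.8, 7.5),
    "-12V": (12.0, 0.15),
    "+5VSB": (5.0, 3.0),
}
for name, (volts, amps) in rails.items():
    print(f"{name}: {volts * amps:.1f} W")

# Card's recommended +12 V capacity vs. this PSU's 12 V rails
# (assumes both 12 V rails can be pooled, which is uncertain):
recommended_w = 12.0 * 18.0              # 216 W recommended
available_w = 12.0 * 5.0 + 12.8 * 7.5    # 60 W + 96 W = 156 W
print(f"recommended: {recommended_w:.0f} W, available: {available_w:.0f} W")
```

By that math, even pooling both 12 V rails falls well short of the 216 W the recommendation implies, which is the gap I'm asking about.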