How can I tell what voltage my video card is?

GrmReeper

Junior Member
Apr 4, 2002
12
0
0
I am upgrading to a new mobo and CPU. The mobo manual says to only use a video card that is 1.5V.

The card I have is an ASUS V7700 GeForce2 GTS AGP2x/4x.

How can I tell what voltage my card uses?

G:confused:
 

LiLithTecH

Diamond Member
Jul 28, 2002
3,105
0
0
Generally, if the card is AGP 2x/4x compliant, it supports both
3.3V and 1.5V.

Plus, the card's keying notches are different (they should line up with the keys in the slot).
 

Lord Evermore

Diamond Member
Oct 10, 1999
9,558
0
76
What matters isn't the 2X compliance, it's the 4X. Most 2X-only video chips support only 3.3V. 4X chips (which are automatically backward-compatible with 2X and 1X) support both. The GeForce2 GTS is a 4X chip that normally runs at 1.5V, but it can also work on a 2X motherboard at 3.3V.

This means a motherboard that supports only 1.5V usually can't use a 2X-only video card. Some 2X cards, however, had their notches keyed so they would physically fit in a 1.5V slot despite not being able to run in that type of slot (a Universal keying on the card, which should ONLY be allowed on a 4X card so that it can fit either 3.3V or 1.5V slots). This is the only reason there has ever been a problem with voltages: some card makers didn't follow the spec exactly.

A motherboard that supports 1.5V should always support any 4X-capable video card, since all 4X cards work at 1.5V.

Newer AGP 8X boards can also be damaged if a 2X (3.3V) card is installed, unless the board maker integrates protection circuitry, as many do.
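
The voltage/keying rules described above can be sketched as a quick lookup. This is just an illustration of the logic, not any real tool; the names and table entries are mine, based on the general rule that 2X-only chips are usually 3.3V-only while 4X cards must support 1.5V:

```python
# Illustrative sketch of AGP card/slot voltage compatibility.
# Voltage sets are assumptions based on the discussion above, not a spec dump.

# Signaling voltages a card of a given speed rating typically supports
CARD_VOLTAGES = {
    "2x-only": {"3.3"},        # most 2X-only chips are 3.3V-only
    "4x": {"3.3", "1.5"},      # 4X cards run at 1.5V; many also handle 3.3V
}

# Voltages a slot of a given keying accepts
SLOT_VOLTAGES = {
    "3.3V-keyed": {"3.3"},
    "1.5V-keyed": {"1.5"},
    "universal": {"3.3", "1.5"},
}

def compatible(card_speed, slot_key):
    """A card works if it shares at least one voltage with the slot."""
    return bool(CARD_VOLTAGES[card_speed] & SLOT_VOLTAGES[slot_key])

# A 4X card (e.g. a GeForce2 GTS) in a 1.5V-only slot: fine
print(compatible("4x", "1.5V-keyed"))       # True
# A 2X-only card in a 1.5V-only slot: not supported electrically,
# even if a mis-keyed card physically fits
print(compatible("2x-only", "1.5V-keyed"))  # False
```

The mis-keyed 2X cards mentioned above are exactly the case where the physical fit and this electrical check disagree.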