CZroe
Lifer
Why did the AGP spec evolve to lower voltages when cards require more juice than ever?
We've probably all run into the voltage limitation at some point. In my case, it was trying to upgrade to a P4 1.3GHz system back when my main card was a 3dfx card 🙂 I think the D850GB was the first board that no longer supported 3.3v cards. My old PIII system was fried, so I didn't have a functioning card for my new setup until I bought a Visiontek GF3 on the release date (then I gave my friend his AIW Radeon back 😉).
Anyway, even back then we saw the 3dfx V5 5500 require a direct PSU connection, and the upcoming (and later canceled) V5 6000 would have required its own [external] PSU!
So life went on at 1.5v until nVidia and ATI chips got as power-hungry as the ancient 3dfx hardware, and now we're stuck with an even more underpowered AGP slot. (Actually, it makes you wonder whether a Radeon 9700 Pro or NV30 would even need a PSU connector if the AGP slot still provided 3.3v...)
I am aware that volts do not translate directly into watts, but what reason could there be for lowering it? It's not like there is such a thing as a ".13 micron AGP trace" or something that would overheat from the higher voltage. It's not like cards that didn't require so much power couldn't just step the voltage down themselves, like all dual-voltage cards have done all along. Or does the AGP slot's power come straight from the ".XX micron" motherboard chipset? An AGP Pro slot's extra power connections certainly don't.
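To put rough numbers on the volts-vs-watts point: if the slot's pins can each only carry so many amps, then a higher-voltage rail delivers more watts through the same pins (P = V × I). A quick sketch of that arithmetic, where the pin count and per-pin current limit are made-up round numbers for illustration, NOT figures from the AGP spec:

```python
# Hypothetical illustration of P = V * I through a current-limited connector.
# Pin count and amps-per-pin are invented round numbers, not AGP spec values.

def slot_power_watts(voltage: float, pins: int, amps_per_pin: float) -> float:
    """Total power a rail can deliver through its allotted slot pins."""
    return voltage * pins * amps_per_pin

# Same connector (same pins, same current limit), different rail voltage:
p_33 = slot_power_watts(3.3, pins=6, amps_per_pin=1.0)  # 19.8 W
p_15 = slot_power_watts(1.5, pins=6, amps_per_pin=1.0)  #  9.0 W
print(f"3.3v rail: {p_33:.1f} W vs 1.5v rail: {p_15:.1f} W")
```

So with identical pins, dropping the rail from 3.3v to 1.5v would cut the deliverable wattage by more than half, which is the sense in which a lower-voltage slot is "underpowered" even though volts aren't watts.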