
Why did the AGP spec evolve to lower voltages when cards require more juice than ever?

CZroe

Lifer

We've probably all run into the voltage limitation at some point. In my case, it was trying to upgrade to a P4 1.3GHz system back when my main card was a 3dfx card 🙂 I think the D850GB was the first board that no longer supported 3.3v cards. My old PIII system was fried, so I didn't have a functioning card for my new setup until I bought a Visiontek GF3 on the release date (then I gave my friend his AIW Radeon back 😉).

Anyway, even then we saw the 3dfx V5 5500 require a direct PSU connection, and the V5 6000, announced but later canceled, would have required its own [external] PSU!

So life went on at 1.5v until nVidia and ATI chips got as power-hungry as the old 3dfx hardware, and now we're stuck with an even more underpowered AGP slot. (Actually, it makes you wonder whether a Radeon 9700 Pro or NV30 would even need a PSU connector if the AGP slot still provided 3.3v...)

I am aware that volts do not translate directly into watts, but what reason could there be for lowering it? It's not like there is such a thing as a ".13 micron AGP trace" or something that would overheat from the higher voltage. It's not like cards that didn't require so much power couldn't just step the voltage down themselves, like all dual-voltage cards have all along. Or does the AGP slot's power come straight from the ".XX micron" motherboard chipset? An AGP Pro slot's extra power connections certainly don't.
 
Voltage is not the same as wattage and the latter is what video cards use more and more of.

In fact a lower voltage usually means that the supplied wattage can be higher so if anything, things are improving.
 
lower voltage = cooler for the same amount of wattage. CPUs used to run at 5V and are now down to 1.5v, GPUs are the same.
 
It takes a finite amount of time for a transistor to transition from the on to the off state (and vice versa). By lowering the operating voltage, switching times may be reduced and faster operation may be achieved.

Generally, increasing the speed of digital electronics means more heat. This is because the capacitance of the transistor gates becomes more and more significant (capacitive reactance falls as frequency rises, so the gates draw more charging current). More charging current means more loss, which takes the form of heat. Another way to see this is that faster operating speeds make the transistors spend a higher proportion of their time in the linear region of operation (as opposed to cutoff or saturation), where losses are larger => more heat.

The loading (current requirements) keeps increasing with each design generation because of the increased complexity of the designs (more transistors), which means there are more places for the current to go.

For a given design, the current will increase with an increase in voltage (Ohm's law), and you see more heat as a result.

-Sid

bacillus is absolutely right: power = V*I, so for increased voltage and/or current, power is increased. However, these designs are not regulated to a constant power level, so decreasing the voltage will also lead to decreasing current and lower temps IF ALL ELSE IS THE SAME.
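Sid's Ohm's-law point can be sketched in a few lines. This is a toy calculation: the fixed load resistance below is an arbitrary illustrative value, not anything measured from a real card.

```python
# Toy illustration: for a fixed resistance (a "given design"), raising the
# voltage raises both the current (Ohm's law, I = V/R) and the dissipated
# power (P = V*I). The load value is purely illustrative.
R = 1.5  # ohms, hypothetical fixed load

for V in (1.5, 3.3):
    I = V / R   # Ohm's law
    P = V * I   # heat dissipated, equivalently V**2 / R
    print(f"{V} V -> {I:.2f} A, {P:.2f} W")
```

Doubling the voltage into the same load doubles the current and quadruples the heat, which is the "more heat as a result" above.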
 
Originally posted by: BFG10K
Voltage is not the same as wattage and the latter is what video cards use more and more of.

In fact a lower voltage usually means that the supplied wattage can be higher so if anything, things are improving.

Like I said, I am aware that lower voltage doesn't equal lower watts. That's not what confuses me.
Watts = Amps * Volts
I just don't see why they would lower the voltage and raise the amps, though.
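The arithmetic behind that question, sketched out. The 25 W card budget here is a made-up illustrative number, not a figure from the AGP spec:

```python
# Delivering the same wattage at a lower voltage requires proportionally
# more current through the slot's supply pins (I = P / V).
# The 25 W draw is a hypothetical card budget, purely for illustration.
P = 25.0  # watts, hypothetical card draw

for V in (3.3, 1.5):
    I = P / V  # amps the slot's supply pins would have to carry
    print(f"{P} W at {V} V needs {I:.1f} A")
```

So for a fixed power budget, the lower-voltage rail has to carry more than twice the current, which is exactly the puzzle being raised.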

Originally posted by: DaveSimmons
lower voltage = cooler for the same amount of wattage. CPUs used to run at 5V and are now down to 1.5v, GPUs are the same.

This is what I meant by "It's not like there is such a thing as a '.13 micron AGP trace.'" Honestly, do you think they lowered it because the slot itself was giving off too much heat? 😀 Edit: Oh, I see now. You're talking about the GPU's voltage. Well, nearly all cards even today are designed to operate at both voltages, meaning the GPU's core voltage is not derived directly from the AGP bus. This could easily continue, especially today, when any extra power could still be utilized.

Originally posted by: Insidious
It takes a finite amount of time for a transistor to transition from the on to the off state (and vice versa). By lowering the operating voltage, switching times may be reduced and faster operation may be achieved.

Generally, increasing the speed of digital electronics means more heat. This is because the capacitance of the transistor gates becomes more and more significant (capacitive reactance falls as frequency rises, so the gates draw more charging current). More charging current means more loss, which takes the form of heat. Another way to see this is that faster operating speeds make the transistors spend a higher proportion of their time in the linear region of operation (as opposed to cutoff or saturation), where losses are larger => more heat.

The loading (current requirements) keeps increasing with each design generation because of the increased complexity of the designs (more transistors), which means there are more places for the current to go.

For a given design, the current will increase with an increase in voltage (Ohm's law), and you see more heat as a result.

-Sid

bacillus is absolutely right: power = V*I, so for increased voltage and/or current, power is increased. However, these designs are not regulated to a constant power level, so decreasing the voltage will also lead to decreasing current and lower temps IF ALL ELSE IS THE SAME.

This throws a monkey wrench into all I've been led to believe. I understand that the more clock cycles a CPU goes through, the more time its transistors spend switching and therefore radiating heat, but it's been commonly known in the overclocking community that increasing the voltage, while also raising the heat output, will yield higher operating frequencies. Voltage has been described to me as the "push" or "pressure" behind the current. It has also been said that it makes more "defined bits," because the difference between the on and off states is greater. However, none of this has any relevance to a 66MHz AGP 1x/2x data bus being either 3.3 or 1.5 volts. The voltage was not dropped to lower heat output, and because the bus speed is primarily limited by the length of the traces, I doubt it was dropped to raise frequencies.



I am starting to wonder, was it only the data bus that dropped to 1.5v and not the power traces? I never even thought about that possibility. Could the power traces have been something different all along?
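One commonly cited first-order reason for lower signal swings on a bus is CMOS dynamic switching power, P ≈ C·V²·f: the power burned charging and discharging a line scales with the square of the swing. A rough sketch, where the capacitance is an arbitrary illustrative value (not anything from the AGP spec):

```python
# First-order CMOS switching-power model, P = C * V**2 * f, applied to a
# single bus line. C is a hypothetical trace + pin capacitance; the only
# point here is the V**2 scaling of the loss.
C = 10e-12   # farads, illustrative line capacitance
f = 66e6     # hertz, the AGP base clock

for V in (3.3, 1.5):
    P = C * V**2 * f
    print(f"{V} V swing -> {P * 1e3:.2f} mW per line")
```

Dropping the swing from 3.3v to 1.5v cuts the per-line switching loss by (3.3/1.5)², roughly a factor of five, which is the kind of headroom that lets signaling rates climb.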
 
When overclocking, the higher voltages help because the signal gets to the on-threshold value (something less than the supply voltage) sooner in time (remember, it is a sloping voltage-vs-time waveform).

Overclocking without a corresponding voltage increase will eventually lead to data being lost, because the signal voltage at a transistor gate will not make it up to the on-threshold value in time to be read correctly.

-Sid

edit: Believe me, when a transistor is fully on (saturated), its resistance is at its lowest. Since P = I*I*R, the power dissipation is lower when the transistor is fully on than when it is in the linear region of operation (partially on => higher resistance).
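Sid's edit can be put in numbers. The current and the two resistances below are arbitrary placeholders chosen only to show the contrast:

```python
# With the current roughly fixed by the load, a transistor dissipates far
# less when fully on (low resistance) than when partially on in the linear
# region (higher resistance), since P = I**2 * R. Values are illustrative.
I = 0.5            # amps flowing through the device
R_saturated = 0.1  # ohms, fully on
R_linear = 5.0     # ohms, partially on

for name, R in (("saturated", R_saturated), ("linear region", R_linear)):
    P = I**2 * R
    print(f"{name}: {P:.3f} W")
```

The partially-on state burns orders of magnitude more power for the same current, which is why time spent transitioning (rather than sitting at cutoff or saturation) shows up as heat.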
 
Originally posted by: Insidious
When overclocking, the higher voltages help because the signal gets to the on-threshold value (something less than the supply voltage) sooner in time (remember, it is a sloping voltage-vs-time waveform).

Overclocking without a corresponding voltage increase will eventually lead to data being lost, because the signal voltage at a transistor gate will not make it up to the on-threshold value in time to be read correctly.

-Sid

edit: Believe me, when a transistor is fully on (saturated), its resistance is at its lowest. Since P = I*I*R, the power dissipation is lower when the transistor is fully on than when it is in the linear region of operation (partially on => higher resistance).

Ah, the other overclocker's analogy: "the slowest circuit in the chip." Many people cite the "pressure" analogy as the reason it helps there too (high pressure = faster-flowing/switching circuit). Not that I know what you're talking about or anything 😕
 
I am starting to wonder, was it only the data bus that dropped to 1.5v and not the power traces? I never even thought about that possibility. Could the power traces have been something different all along?
I think you're right on the money about the power traces remaining the same while the data traces were lowered. The power traces are probably 3.3v, 5v, and 12v.
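If that's right, the slot's total power budget would just be the sum of volts times amps over those rails. A hypothetical tally, where the per-rail current limits are placeholders and NOT figures from the AGP specification:

```python
# Hypothetical slot power budget: total wattage is the sum of V * A over
# each supply rail. The assumed per-rail current limits are placeholders,
# not numbers taken from the AGP spec.
rails = {3.3: 6.0, 5.0: 2.0, 12.0: 1.0}  # volts -> assumed max amps

total = sum(v * a for v, a in rails.items())
print(f"total slot budget: {total:.1f} W")
```

Under those assumed limits, the 1.5v data signaling would be irrelevant to the budget: the power rails, not the signal swing, set how many watts the slot can deliver.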
 