volts, amps, wattage and electronics

ZippyDan

Platinum Member
Sep 28, 2001
2,141
1
81
My experience with AC/DC power adapters has been this:

Always try to match the volts/amp rating as closely as possible, but never go over on the amps. I've found that using a power adapter that supplies too many amps will often fry the device.

On the other hand, I've had no problems using, say, a 15-volt adapter when the device only calls for 10V, as long as the amps are close to correct.

My friend, who is a building electrician and is taking the A+ test, ran across a test question that was basically:

When replacing an ac/dc power adapter, which is best?

A. An adapter with higher voltage
B. An adapter with same wattage
C. An adapter with higher amps
D. An adapter with lower amps

I said it should be A since I think amps are more important. He said the correct answer is supposed to be B since volts are THE most important. I countered that an adapter with the same wattage could have different volts AND amps because it is only giving you the product of the two. Further, he says that voltage is always delivered to the device and that too much will fry it, while amps are only available to the device and not necessarily delivered: the device pulls what it needs.

Anyway, ignoring the fact that I think "same wattage" should say "same voltage" if he is correct, my experience tells me he is full of BS - what is the real story here?
 

Red Squirrel

No Lifer
May 24, 2003
69,736
13,351
126
www.betteroff.ca
You don't want to go over the voltage. The amps are not "given" to the device; they are drawn by the device. So you can use a 1000-amp-rated power source to power something that only needs 1 amp, as long as the voltage is the same: the device will only pull 1 amp. From a safety point of view you want to have slightly higher amperage available, but this applies more in high-voltage situations.

Ex: Don't install a 40 amp plug for your dryer when it only needs a 30 amp one.

Now, you also don't want your source to be underpowered, e.g. fewer amps or fewer volts than needed. Volts should be as close as you can get; with the volts the same, amps should be equal or higher.

Computer power supplies are a good example of this. All of them give the same voltages, but the wattages are different, so the only variable left is amps. A 1000W PSU can supply more amps than a 200W PSU, but it won't fry your components.
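
If you wanted to boil that rule down to code, it would look something like this (a rough Python sketch; all the ratings are made-up examples):

```python
# Rough sketch of the replacement-adapter rule from above: voltage must match,
# and the adapter's current rating must meet or exceed what the device needs.
# All ratings here are hypothetical examples.

def adapter_ok(adapter_volts, adapter_amps, device_volts, device_amps):
    """Return True if the adapter can safely power the device."""
    return adapter_volts == device_volts and adapter_amps >= device_amps

print(adapter_ok(12.0, 5.0, 12.0, 1.0))  # True: same volts, plenty of amps
print(adapter_ok(15.0, 1.0, 12.0, 1.0))  # False: overvoltage risk
print(adapter_ok(12.0, 0.5, 12.0, 1.0))  # False: can't supply enough current
```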
 

CycloWizard

Lifer
Sep 10, 2001
12,348
1
81
None of the above (maybe C?). The power supply must supply the specified voltage to the device, as well as being able to produce at LEAST as many amps as the device specifies. A supply which outputs the correct voltage but is capable of higher current/power outputs will not fry the device, since the device will never draw this extra current.
 

Aluvus

Platinum Member
Apr 27, 2006
2,913
1
0
Match the voltage, match or exceed the current rating. Your "experience" that supplies rated for a higher current will damage the device is incorrect and probably reflects some other factor that you did not realize at the time.

Of the possible answers given, (C) is the best (and even then, only assuming that the voltage matches). With that same assumption, (B) is also fine (but not as good).

(A) is not great, although a small discrepancy in voltage is generally fine (most devices have some onboard voltage regulation). (D) is similar; doubly so since the adapters that ship with devices are normally over-specified by a bit relative to what the device will actually draw in practice.
 

silverpig

Lifer
Jul 29, 2001
27,703
12
81
Look for stories of what happens if the voltage switch on the back of the PSU is set wrong.
 
May 11, 2008
21,712
1,302
126
Match the voltage, match or exceed the current rating. Your "experience" that supplies rated for a higher current will damage the device is incorrect and probably reflects some other factor that you did not realize at the time.

Of the possible answers given, (C) is the best (and even then, only assuming that the voltage matches). With that same assumption, (B) is also fine (but not as good).

(A) is not great, although a small discrepancy in voltage is generally fine (most devices have some onboard voltage regulation). (D) is similar; doubly so since the adapters that ship with devices are normally over-specified by a bit relative to what the device will actually draw in practice.


I totally agree.

ZippyDan should search for a replacement power supply with the same voltage, where the maximum available current from the replacement power supply equals or surpasses the maximum output current of the original power supply.

The law: P = U * I, where P = power, U = voltage, and I = current.

Therefore the replacement power supply can deliver an equal or larger amount of power at the same voltage. But this all depends on the load.

However, sometimes only the input power consumption of the power supply is printed on it, and then you must take the efficiency of the power supply into account. A rule of thumb is that with switching power supplies the efficiency is at least 75% (and that is really a worst-case scenario), meaning the maximum output power available is 75% of the input power.
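
To put rough numbers on that rule of thumb (the input rating here is a made-up example):

```python
# Worked example of P = U * I plus the 75% worst-case efficiency rule of thumb.
# The input rating below is a made-up example.

input_volts = 120.0   # mains voltage
input_amps = 0.5      # input current printed on the adapter
efficiency = 0.75     # worst-case rule of thumb for switching supplies

input_power = input_volts * input_amps       # P = U * I -> 60 W in
max_output_power = input_power * efficiency  # ~45 W available out

print(f"Input power: {input_power:.0f} W")
print(f"Max output:  {max_output_power:.0f} W (assuming 75% efficiency)")
```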
 
Last edited:

ShawnD1

Lifer
May 24, 2003
15,987
2
81
However, sometimes only the input power consumption of the power supply is printed on it, and then you must take the efficiency of the power supply into account. A rule of thumb is that with switching power supplies the efficiency is at least 75% (and that is really a worst-case scenario), meaning the maximum output power available is 75% of the input power.

What I see a lot of the time is a sticker showing the output power, then the input current and voltage. It will look something like this:
Max Output: 20W
Input Voltage: 120V
Current: 0.2A (at 120V)

You can see stuff like this if you look at your computer PSU. It will say something like Antec 540W, input of 120V at 5A. It often won't directly state the efficiency, and if it's a really crappy PSU you can't even guess at the efficiency, because you don't know the power factor.
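
A quick sketch with the hypothetical sticker numbers above, showing why the label alone doesn't give you the efficiency:

```python
# Using the hypothetical sticker numbers: the label gives apparent power (VA)
# in and real power (W) out. Without the power factor you can't pin down the
# efficiency.

max_output_w = 20.0   # "Max Output: 20W"
input_volts = 120.0   # "Input Voltage: 120V"
input_amps = 0.2      # "Current: 0.2A"

apparent_power_va = input_volts * input_amps  # 24 VA drawn from the wall
# Real input watts = apparent power * power factor, and PF isn't on the label,
# so all you can say is: efficiency = 20 W / (24 VA * PF), PF unknown.
print(f"Apparent input power: {apparent_power_va:.0f} VA")
print(f"Efficiency = {max_output_w:.0f} W / ({apparent_power_va:.0f} VA * PF), PF unknown")
```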
 

Cogman

Lifer
Sep 19, 2000
10,284
138
106
As has been stated, the only two not-incorrect answers are B and C (C being the better choice). Matching wattage only goes so far: too low a current and your system doesn't power up; too high a voltage and you fry components.

If the wattage is the same, then there is a possibility that the volts and amps are the same, but there is also a possibility that they aren't.

If the amps are too high, nobody cares: power adapters are voltage sources, not current sources, so the amp rating is strictly the maximum current draw available. The catch is that matching amps alone still leaves you with a potentially unknown voltage.
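
A toy illustration of that point, with made-up numbers: the load sets the current via Ohm's law, and the amp rating is only a ceiling.

```python
# The adapter is a voltage source: it holds the voltage, and the load sets the
# current via Ohm's law (I = V / R). The amp rating is just the most the
# adapter can deliver. All values are made-up examples.

supply_volts = 12.0
supply_max_amps = 5.0  # a rating (ceiling), not a forced output

for load_ohms in (24.0, 6.0, 2.0):
    drawn_amps = supply_volts / load_ohms  # I = V / R
    ok = drawn_amps <= supply_max_amps
    print(f"{load_ohms:4.1f} ohm load draws {drawn_amps:.1f} A "
          f"({'fine' if ok else 'exceeds the rating'})")
```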
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Really, that is a bad question, because "AC/DC adapter" can mean a lot of things.
It could be an adapter that uses transformer + diodes + cap = output. One like that, which is common in cheap devices, will vary its output voltage with the load: put a 1A load on it and it outputs 5VDC, then put 0.5A on it and it outputs 9VDC.

Switching adapters can cause the same problem if they use the load to stabilize the output. Using the wrong load can cause the voltage to differ.

The reason that many people can get away with using something like a 12VDC adapter on a 5VDC device is mainly luck. Most devices convert the power to other voltages inside the device. So your router has an adapter rated 12VDC @ 1A, but inside it is converted to 5V @ 2A. If someone uses a 15VDC adapter with that router, it might work, because the regulator inside can accept voltages from 9VDC to 18VDC. It just depends on the design and is not recommended unless you are willing to pop open the device and study how it does its conversion :)
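
A sketch of that last scenario (the 9-18VDC window is the hypothetical example from above, not a real part's spec):

```python
# Sketch of why a 15VDC adapter can happen to work on a "12VDC" router: the
# internal regulator has an input window. The 9-18VDC window below is the
# hypothetical example from the post, not a real part's spec.

REGULATOR_MIN_V = 9.0
REGULATOR_MAX_V = 18.0

def regulator_tolerates(adapter_volts):
    return REGULATOR_MIN_V <= adapter_volts <= REGULATOR_MAX_V

for volts in (5.0, 12.0, 15.0, 20.0):
    verdict = "might work" if regulator_tolerates(volts) else "out of range"
    print(f"{volts:4.1f} V adapter: {verdict}")
```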
 
Last edited:

larslake

Member
Sep 30, 2009
34
0
0
P = I * E, where P = watts, I = amps, E = volts. The answer is 'B' relative to a correct replacement measured in a power rating (watts).
 

ussfletcher

Platinum Member
Apr 16, 2005
2,569
2
81
If you think of electricity as a pipe with water in it, voltage being pressure and amperage being fluid flow, then you can see why this answer makes sense.

For example, if your device only uses 1A but the supply is rated for 100A, it's the same as having a garden hose attached to a water main: only so much flow can occur. However, with more pressure (voltage) you would completely blow out the hose (circuitry).
 

aj654987

Member
Feb 11, 2005
117
14
81
From a safety point of view you want to have slightly higher amperage available, but this applies more in high-voltage situations.

Ex: Don't install a 40 amp plug for your dryer when it only needs a 30 amp one.

Uh, are you talking about breakers or fuses? Because for a plug/cord it shouldn't matter, and it should be rated higher than what needs to be drawn through it.
 

C1

Platinum Member
Feb 21, 2008
2,376
112
106
It is sort of a trick question. It is the classic test-question type used on DMV tests: you are to select the "best" (or least wrong) answer. The answer is an incomplete statement of the situation, but it still best fits, parametrically, the attributes of a known correct adapter. The purpose of this type of question is to force you to consider and analyze the problem in a wider scope using valid theory and application, so it's testing your understanding of P=IE and the relative importance of I versus E.

Even though "B" is the correct answer for this test, in practical application selecting an adapter of the correct voltage with a rating of more output amps is normally satisfactory from an operational standpoint. However, providing a customer with a resultant physically larger/heavier adapter, because it is more capable, may be a turn-off (i.e., not the best match).
 

Red Squirrel

No Lifer
May 24, 2003
69,736
13,351
126
www.betteroff.ca
Uh, are you talking about breakers or fuses? Because for a plug/cord it shouldn't matter, and it should be rated higher than what needs to be drawn through it.

Yeah, guess I worded that a bit confusingly. I meant your source should be able to provide a bit more amps than what the appliance needs, since if it's exactly enough you are using 100% of it. But yes, the wire needs to be rated for that amperage. Don't over-rate either. I forget the rule, but I think you should stay below 80% of the actual rated load. I don't know if it's a code or just a rule of thumb, though, and it applies mostly to continuous loads. Best bet is to just use whatever the appliance asks for.
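
If that 80% figure is right, the arithmetic is simple (treat this as the rule of thumb described above, not electrical-code advice):

```python
# The 80% figure from the post as arithmetic: keep continuous loads under 80%
# of the circuit rating. (This is the poster's rule of thumb, not code advice.)

breaker_amps = 30.0                   # e.g. a 30 A dryer circuit
max_continuous = breaker_amps * 0.80  # 24 A continuous

print(f"{breaker_amps:.0f} A circuit -> keep continuous loads under "
      f"{max_continuous:.0f} A")
```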
 

drinkmorejava

Diamond Member
Jun 24, 2004
3,567
7
81
Really, that is a bad question, because "AC/DC adapter" can mean a lot of things.
It could be an adapter that uses transformer + diodes + cap = output. One like that, which is common in cheap devices, will vary its output voltage with the load: put a 1A load on it and it outputs 5VDC, then put 0.5A on it and it outputs 9VDC.

Switching adapters can cause the same problem if they use the load to stabilize the output. Using the wrong load can cause the voltage to differ.

The reason that many people can get away with using something like a 12VDC adapter on a 5VDC device is mainly luck. Most devices convert the power to other voltages inside the device. So your router has an adapter rated 12VDC @ 1A, but inside it is converted to 5V @ 2A. If someone uses a 15VDC adapter with that router, it might work, because the regulator inside can accept voltages from 9VDC to 18VDC. It just depends on the design and is not recommended unless you are willing to pop open the device and study how it does its conversion :)

I wouldn't say it's necessarily luck, and you wouldn't need a regulator either. Any well-designed product should have a voltage limiter on the input. I'd never actually try using a higher-voltage power supply without looking, but you have a good chance of it working. It's amazing how a few diodes can make life so easy.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
I wouldn't say it's necessarily luck, and you wouldn't need a regulator either. Any well-designed product should have a voltage limiter on the input. I'd never actually try using a higher-voltage power supply without looking, but you have a good chance of it working. It's amazing how a few diodes can make life so easy.

Most consumer products have zero protection from overvoltage. Most are power input -> fuse -> power regulation. To keep costs low, most will use a 16V capacitor on a 12V input, so if you put 20V on that input, the capacitor blows, the power regulation shorts out, then finally the fuse blows. The way they see it, they ship the product with the correct adapter, and if you use something else then it is your problem.
 

mindless1

Diamond Member
Aug 11, 2001
8,626
1,687
126
Really, that is a bad question, because "AC/DC adapter" can mean a lot of things.
It could be an adapter that uses transformer + diodes + cap = output. One like that, which is common in cheap devices, will vary its output voltage with the load: put a 1A load on it and it outputs 5VDC, then put 0.5A on it and it outputs 9VDC.

Switching adapters can cause the same problem if they use the load to stabilize the output. Using the wrong load can cause the voltage to differ.

The reason that many people can get away with using something like a 12VDC adapter on a 5VDC device is mainly luck. Most devices convert the power to other voltages inside the device. So your router has an adapter rated 12VDC @ 1A, but inside it is converted to 5V @ 2A. If someone uses a 15VDC adapter with that router, it might work, because the regulator inside can accept voltages from 9VDC to 18VDC. It just depends on the design and is not recommended unless you are willing to pop open the device and study how it does its conversion :)

THIS is the heart of the issue. Manufacturers will spec the lower-cost PSUs with the minimal current needed, to save money versus the warranty replacement costs.

Thus, you're usually better off going with a slightly higher-current replacement PSU (and/or a higher-quality one, though quality is sometimes harder to gauge until you have it in hand for examination) if it is regulated, but staying with the same current rating if it is unregulated. That is easy enough to check: measure the output voltage with no load, and if it approaches [spec'd voltage * 1.41] - 1.4V, you have a typical unregulated PSU, like most of the old-school wall warts with high weight density. Even the typical low-end regulated PSU won't come close to the output of that formula without the product powered as a load.
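
A quick sketch of that no-load check with a hypothetical 12V wall wart:

```python
# The no-load check from the post: an unregulated transformer supply floats up
# to roughly (rated voltage * 1.41) minus ~1.4 V of rectifier drop when
# unloaded. The 12 V rating below is a hypothetical example.

rated_volts = 12.0                           # label says "12 VDC"
no_load_estimate = rated_volts * 1.41 - 1.4  # ~15.5 V for a 12 V wall wart

print(f"A {rated_volts:.0f} V unregulated wall wart may read about "
      f"{no_load_estimate:.1f} V with no load")
```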

Either way, you should use a replacement PSU of the same voltage. However, there is some wiggle room: sometimes a slightly higher or lower voltage PSU could work OK IF you know the circuit and the margins of the design... but if you know enough to know that, you probably don't need anyone to tell you what'll work.
 
Last edited: