How can I measure current output without a multimeter?

Rakehellion

Lifer
Jan 15, 2013
12,182
35
91
Problem: I have a projector with an LED chip lamp. I want to measure the current output of the projector's power supply and thus ascertain the LED's wattage.

I can't really measure the current from the power supply directly because my multimeter is telling me 0 amps (possibly because it draws too much current and the power supply automatically shuts off when it's shorted). I don't have a Kill-A-Watt.

1. How can I measure current output without a multimeter? One unscientific method would be to light the LED from a known power supply and compare the brightness against what the unknown power supply produces, but I don't have a light meter either.

2. How much is it okay to overload an LED chip lamp? For example, if it's rated for 20W, can I give it 30W with enough cooling?
 

Fardringle

Diamond Member
Oct 23, 2000
9,188
753
126
Does the projector manufacturer not list that information in the device specs/documentation? Or do you not trust their information and want to measure it on your own?

I'm not sure that you can get an accurate measurement unless you can trick the projector's power supply into thinking that the LED lamp is installed when it actually isn't there. Or devise something that will sit between the power supply and the LED to measure the current as it passes through to the LED.
 

Mr Evil

Senior member
Jul 24, 2015
464
187
116
mrevil.asvachin.com
...I can't really measure the current from the power supply directly because my multimeter is telling me 0 amps (possibly because it draws too much current and the power supply automatically shuts off when it's shorted)...
Am I interpreting this wrong, or did you set your multimeter to measure current, then connect it directly across the output of the PSU? If you did, then you will have blown a fuse in the multimeter, and it will read 0 until you replace it. You need to connect it in series with the load to measure current.

1. Put a small resistor in series with the load, measure the voltage across it, and use Ohm's law to work out the current (see the quick sketch after this list).

2. To a certain extent yes, but there's a limit to how cool you can keep an LED due to its small size.
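
A minimal sketch of option 1, assuming a hypothetical 0.1 Ω shunt (the resistor value and readings below are placeholders, not measurements from this projector):

```python
# Shunt-resistor current measurement: Ohm's law, I = V / R.
# All values are hypothetical -- substitute your own readings.
R_SHUNT = 0.1      # ohms; small, so it barely disturbs the circuit
v_shunt = 0.19     # volts measured ACROSS the shunt, meter in voltage mode

current_a = v_shunt / R_SHUNT              # I = V / R   ->  1.9 A
shunt_heat_w = current_a ** 2 * R_SHUNT    # P = I^2 * R ->  ~0.36 W

print(f"Current: {current_a:.2f} A, shunt dissipates {shunt_heat_w:.2f} W")
```

Measuring across a small shunt in voltage mode also means you aren't relying on the meter's current range (and its fuse) at all, and the I²R figure tells you what power rating the shunt resistor itself needs.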
 

Rakehellion

Lifer
Jan 15, 2013
12,182
35
91
Am I interpreting this wrong, or did you set your multimeter to measure current, then connect it directly across the output of the PSU? If you did, then you will have blown a fuse in the multimeter, and it will read 0 until you replace it. You need to connect it in series with the load to measure current.

Okay, I did that. We're at 1.9A.

2. To a certain extent yes, but there's a limit to how cool you can keep an LED due to its small size.

It's a chip lamp, in a package designed to accommodate a heat sink.

[Attached image: the LED chip lamp package]



My diode array is rated at 20W and the manufacturer recommends 32V at 600mA. Yet I tried it with an 1100mA power supply and it doesn't seem as bright as it could be. Of course the LED has an internal voltage drop and resistance that are not known, so would it be okay to use it without a resistor on the 60W power supply? Or would I need something like a 1 Ohm resistor for a 10W-20W lamp?
 

Mr Evil

Senior member
Jul 24, 2015
464
187
116
mrevil.asvachin.com
LEDs are non-linear, so you can't supply them with the correct current by connecting a voltage source directly to them; you need some sort of constant current source. You can approximate a current source by using a voltage source and an appropriately sized resistor, but it's not very efficient, especially for an LED as big as 20W.
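
To make the resistor approach concrete, here's a quick sizing sketch. The 36V supply is just an assumed example (the actual output voltage of the 60W supply isn't known); the LED figures are the manufacturer's 32V / 600mA recommendation:

```python
# Sizing a ballast resistor to approximate a constant-current source.
# The supply voltage is an assumed example, not a spec for this projector.
v_supply = 36.0     # volts (hypothetical supply)
v_forward = 32.0    # volts across the LED at the target current
i_target = 0.6      # amps (manufacturer's recommended 600 mA)

r_ballast = (v_supply - v_forward) / i_target    # -> ~6.7 ohms
p_resistor = i_target ** 2 * r_ballast           # -> ~2.4 W lost as heat
efficiency = v_forward / v_supply                # -> ~89% reaches the LED

print(f"R = {r_ballast:.1f} ohm, resistor burns {p_resistor:.1f} W, "
      f"{efficiency:.0%} of supply power reaches the LED")
```

The larger the gap between the supply voltage and the LED's forward voltage, the better the current regulation but the more power the resistor wastes, which is why a proper constant-current driver is the usual choice at this power level.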

The projector will have a more sophisticated current source inside it, and it's the current and voltage output of that driver that you need to measure to know the power of the projector's LED.
 

Rakehellion

Lifer
Jan 15, 2013
12,182
35
91
Does the projector manufacturer not list that information in the device specs/documentation? Or do you not trust their information and want to measure it on your own?

No, the documentation doesn't give anything as technical as the actual wattage of the LED.

I read online that a particular model of projector uses a 50W LED and is 5000 lumens. My projector is 1500 lumens, so by extrapolation I'm guessing it's a 20W LED. So perhaps I can use my new LED without a resistor. If I were to need a resistor, it would be something of very low value and very high wattage, which I don't have.

Is it possible to change the current output of the power supply itself by swapping out a component?
 

mindless1

Diamond Member
Aug 11, 2001
8,075
1,452
126
Can you just tell us exactly what you're trying to do? Are you replacing the original LED in a projector with a random one instead of the same part, or converting the projector to LED, or ??

That picture looks like a 100W LED: a 10 x 10 array of 1W dies. That would be consistent with the 32Vf you mentioned, but the 600mA, yes, that would be the spec for a 20W LED. The thing is, ALL the 20W LEDs I've seen with a 32Vf used only two rows of 10 1W dies, 20 total, not 100 dies as pictured.
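
A quick back-of-envelope check on that die count, assuming roughly 3.2V per die and strings of 10 dies in series (which is what a 32Vf rating suggests; exact figures vary by part):

```python
# Sanity check: dies per package vs. drive current at a 32 Vf rating.
# Assumes ~3.2 V per die, 10 dies in series per string, ~1 W per die.
V_PER_DIE = 3.2
DIES_PER_STRING = 10
vf = V_PER_DIE * DIES_PER_STRING        # -> ~32 V forward voltage

for total_dies in (20, 100):            # 1 W per die -> 20 W or 100 W total
    watts = total_dies * 1.0
    strings = total_dies // DIES_PER_STRING
    current_ma = watts / vf * 1000
    print(f"{total_dies} dies ({strings} parallel strings): "
          f"~{watts:.0f} W at {vf:.0f} V -> ~{current_ma:.0f} mA")
# 20 dies  -> ~625 mA  (close to the 600 mA spec)
# 100 dies -> ~3125 mA (far above 600 mA)
```

So a genuine 100-die part driven at only 600mA would be running at roughly a fifth of its rated power.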

That doesn't mean you can run it at 100W unless you have sufficient heatsinking. It could easily be that it's under-driven to meet the thermal limitations, and that's how it would arrive at 1500 lumens. It needs to be mounted on the heatsink you're going to use, then you can measure the temperature at any particular current level, though of course if it's going in a projector then it'll need to be closed back up and left running to reach max temperature.

If you can provide a picture of the heatsink, and fan if so equipped, I might be able to make a rough guess at what wattage it can handle and still have good lifespan. But the thing is, some of the generic Chinese LEDs have defects that waste a lot of power as heat, so really a temperature measurement is the only safe way to know what is going on.
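
As a rough feel for why the heatsink matters so much, here's a junction-temperature estimate with made-up thermal resistances (placeholders, not specs for this lamp or heatsink; it also conservatively treats all electrical power as heat):

```python
# Steady-state junction temperature: T_j = T_ambient + P * R_thermal(total).
# All thermal resistances below are hypothetical placeholders.
T_AMBIENT = 35.0            # deg C inside a closed-up projector
R_JUNCTION_TO_CASE = 1.0    # deg C per watt
R_CASE_TO_SINK = 0.5        # deg C per watt (mounting interface)
R_SINK_TO_AIR = 2.0         # deg C per watt (heatsink + fan)

def junction_temp(power_w: float) -> float:
    r_total = R_JUNCTION_TO_CASE + R_CASE_TO_SINK + R_SINK_TO_AIR
    return T_AMBIENT + power_w * r_total

for watts in (20, 30, 60):
    print(f"{watts} W -> roughly {junction_temp(watts):.0f} deg C at the junction")
```

With those example numbers, 20W already lands around 105°C and 60W is far beyond anything the LED would survive, which is exactly why a temperature measurement on the real heatsink is the only safe way to know.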

If you just want a way to drive an LED at 20W, mains AC input LED drivers for that are not very expensive, maybe $5 on ebay, or a couple bucks more in its own plastic enclosure. But if the original projector illumination source was LED, odds are it already has a regulated driver circuit. I'm now back to wondering what you are trying to do.
 

PowerEngineer

Diamond Member
Oct 22, 2001
3,553
726
136
Okay, I did that. We're at 1.9A.

I do not know anything about LED chip lamps, but if the power supply is DC then you can calculate the power consumption by multiplying the current through the device (which you have measured already) by the voltage drop across the device (which should be easy to measure directly).

Using the manufacturer's recommendations, 32V times 600mA is about 19 watts.
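
If it helps to see it worked in one place, a minimal sketch (the readings are placeholders; substitute whatever you actually measure across and through the LED):

```python
# Power from DC measurements: P = V * I.
# Placeholder values -- use your own measured numbers.
v_across_led = 32.0     # volts, meter across the LED terminals
i_through_led = 0.6     # amps, meter in series with the LED

power_w = v_across_led * i_through_led
print(f"LED power: {power_w:.1f} W")    # -> 19.2 W
```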

Hope this helps...