Your "Don't give me forumlas" statement means you don't really want an answer. Take it from an EE, you can't get there from here without a simple formula, but it's
REALLY simple.
All LEDs have a spec for how much current they require. It is usually given as a range from minimum to maximum brightness, and it is expressed in mA (milliamps). To determine the value of the resistor you need, use Ohm's law:
E = I x R, where
E = Voltage in volts
I = Current in amps
R = Resistance in ohms
Transposing Ohm's law:
R = E / I.
Your power source = 5 volts, and your LED drops 3.7 volts. Therefore, the voltage across the resistor is
5 volts - 3.7 volts = 1.3 volts.
Suppose, for example, that the required current for your LED is in the range of 5 - 20 mA (.005 - .02 A); you will have to substitute the actual values for your LED. Therefore:
For 5 mA:
R = 1.3 / .005 = 260 ohms
For 20 mA:
R = 1.3 / .02 = 65 ohms
This means that, for this current range, you need a resistor between 65 and 260 ohms.
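
If you'd rather have the computer do the arithmetic, here is a quick Python sketch of the same calculation. It just uses the example numbers above (5 volt supply, 3.7 volt LED drop, 5 - 20 mA), so swap in the values for your own LED:

    # Resistor range for an LED: R = E / I, where E is the voltage
    # left over after the LED drop. Example values only - substitute yours.
    supply_v = 5.0      # supply voltage (volts)
    led_drop_v = 3.7    # LED forward voltage drop (volts)
    i_min = 0.005       # minimum LED current (amps) = 5 mA
    i_max = 0.020       # maximum LED current (amps) = 20 mA

    resistor_v = supply_v - led_drop_v   # voltage across the resistor = 1.3 V
    r_max = resistor_v / i_min           # largest resistor -> dimmest (260 ohms)
    r_min = resistor_v / i_max           # smallest resistor -> brightest (65 ohms)

    print(f"Resistor range: {r_min:.0f} to {r_max:.0f} ohms")
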
Watt's law (for power) states:
P = E x I
The worst-case power dissipated in the resistor will be at the maximum current. Therefore:
P = 1.3 x .02 = .026 watts, so a 1/4 watt resistor will be more than adequate. You can test various values in this range to get the brightness you want, starting with the largest value and working down. That will prevent burning out the LED while you test it.
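
Same idea for the power check, if you want it, again using the example 20 mA maximum:

    # Worst-case power dissipated in the resistor: P = E x I at maximum current.
    resistor_v = 1.3    # voltage across the resistor (volts)
    i_max = 0.020       # maximum LED current (amps)
    p_watts = resistor_v * i_max
    print(f"Resistor dissipation: {p_watts:.3f} W")   # 0.026 W, well under 1/4 W
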
PM me for more info about building it on a card. What you need depends on whether you're building one or two or many.