NaughtyusMaximus, no, it is not just V = IR. You do get the right general result, but you don't show where it comes from.
The LED drops voltage too, though not like a resistor: being a diode, it holds a roughly constant drop near its Vf. Generally one puts a resistor in series with the LED, so the resistor takes up the rest of the supply voltage and limits the current.
When you say that the LED is rated at 2.1V, is that its Vf (forward voltage, sometimes called the threshold voltage)? If so, the LED will not turn on unless it is given at least that much voltage (LEDs are a type of diode). Generally 1.2V or 1.5V is not enough to turn on an LED. Now, if you put two AA batteries in series for 3V, then you would use a current-limiting resistor.
[Edit: Some of the flashing LEDs have a Vf of 1.2V.]
Let's say you were using a 3V voltage source and you wanted to pick the resistor for an LED with a Vf of 2.1V:
3V - 2.1V = 0.9V
So we want a 0.9V drop across the resistor. If we ran the LED at its maximum rated current (20mA is typical for a standard indicator LED):
0.9V = 0.020A * R
R = 45 Ohms
With two 1.2V batteries in series, one would have 2.4V - 2.1V = 0.3V across the resistor:
0.3V = 0.020A * R
R = 15 Ohms
I would not run the LED at its maximum rated current as that will shorten its lifetime. Try using 15mA:
0.9V = 0.015A * R
R = 60 Ohms
or
0.3V = 0.015A * R
R = 20 Ohms
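If you want to try other supplies or currents, here is a quick Python sketch of the same arithmetic. The function name is mine, and the voltages and currents are just the example numbers from above:

def series_resistor(v_supply, v_f, i_led):
    # Ohm's law on the resistor's share of the voltage: R = (Vs - Vf) / I
    return (v_supply - v_f) / i_led

V_F = 2.1  # forward voltage of the example LED, in volts

for v_supply in (3.0, 2.4):       # two 1.5V cells, two 1.2V cells
    for i_led in (0.020, 0.015):  # max rated 20mA, and the gentler 15mA
        r = series_resistor(v_supply, V_F, i_led)
        print(f"{v_supply}V supply at {i_led * 1000:.0f}mA -> {r:.0f} Ohms")

That prints the same 45, 60, 15, and 20 Ohm values worked out above.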
Now for the wattage of the resistors, W = V * A:
W = 0.9 * 0.015 = 0.0135
W = 0.3 * 0.015 = 0.0045
1/4 W resistors will work in either case. Those are the thin ones you find at most electronics stores; the thicker ones are 1/2 W. Other sizes exist too, such as 1/8 W (common for surface-mount parts, I believe) and ratings above 1/2 W, but at your local electronics store you will most likely come across 1/4 W and 1/2 W. You can always ask what the wattage of a resistor is before purchasing it.
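And a quick sanity check of the dissipation in code, continuing the sketch above. The helper name is mine, and the rating list is just the standard 1/8, 1/4, and 1/2 W sizes mentioned:

def resistor_power(v_drop, i_led):
    # power dissipated in the resistor: W = V * A
    return v_drop * i_led

STANDARD_RATINGS = (0.125, 0.25, 0.5)  # 1/8W, 1/4W, 1/2W

for v_drop, i_led in ((0.9, 0.015), (0.3, 0.015)):
    w = resistor_power(v_drop, i_led)
    # smallest standard rating that covers the dissipation
    rating = next(r for r in STANDARD_RATINGS if r >= w)
    print(f"{v_drop}V drop at {i_led * 1000:.0f}mA -> {w:.4f}W (a {rating}W part covers it)")

Both cases come out far below even 1/8 W, which is why the common 1/4 W parts are more than enough.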