That'll be fine.
For power supplies like this, match the voltage of the original (both the number of volts and the type - AC or DC), match the polarity of the plug, and get a current rating that's greater than or equal to the original power supply's.
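If it helps to see that checklist spelled out, here's a minimal sketch of the comparison (the field names and sample values are just made up for illustration, not from any particular adapter):

```python
# Hypothetical sketch: compare a replacement supply's label against the original's.

def is_compatible(original, replacement):
    """Return True if the replacement supply is a safe substitute."""
    return (
        replacement["voltage"] == original["voltage"]          # volts must match
        and replacement["type"] == original["type"]            # AC vs DC must match
        and replacement["polarity"] == original["polarity"]    # e.g. center-positive plug
        and replacement["max_current_mA"] >= original["max_current_mA"]  # >= is fine
    )

original = {"voltage": 12, "type": "DC", "polarity": "center-positive", "max_current_mA": 500}
replacement = {"voltage": 12, "type": "DC", "polarity": "center-positive", "max_current_mA": 700}

print(is_compatible(original, replacement))  # True: a higher current rating is spare capacity
```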
The current rating of the power supply is its maximum possible output. So if you've got a power supply that can output 700mA, but the device it's powering is only rated for 100mA, that means that the power supply will simply have spare and unused capacity. The device will only take the 100mA it needs.
Voltage is essentially a measure of the amount of motivation to move that's been given to electrical charges. Current is a measure of how many electrical charges are actually flowing past a point each second.
So you want to match the voltage of a power supply to what the device requires.
If a power supply is rated for a higher current output than the device needs, that means the power supply is capable of supplying more current, but the device will only take what it needs.
Your wall outlets are likely rated for 120V and 15 amps. If you plug a 60W 120V light bulb into the outlet, it will take 500mA. If you plug that same light bulb into a 120V 200A feed, it will still only take 500mA.
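Those numbers come from P = V × I; here's that arithmetic worked through, just to show the draw depends on the bulb, not on the outlet's rating:

```python
# The bulb's current draw follows from its own ratings (I = P / V),
# not from the maximum current the supply behind it can deliver.
bulb_power_W = 60
line_voltage_V = 120

current_A = bulb_power_W / line_voltage_V
print(f"{current_A * 1000:.0f} mA")  # 500 mA, whether the feed is rated for 15 A or 200 A
```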