Unfortunately, the Kill-A-Watt would only be able to tell you how much your electric bill would be impacted. To know how much the speakers draw from the amplifier, you have to measure between the amplifier and the speakers, which means probing the speaker terminals with a multimeter.
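If you do take that measurement, the arithmetic is simple. A minimal sketch, assuming you treat the speaker's nominal impedance as a purely resistive load (real speakers are reactive, so this is only a ballpark):

```python
# Rough power estimate from an RMS voltage reading at the speaker terminals.
# Assumes the nominal impedance behaves as a pure resistance, which real
# speakers do not -- treat the result as a ballpark figure only.

def estimated_power_watts(v_rms: float, nominal_impedance_ohms: float) -> float:
    """P = V^2 / Z for an assumed resistive load."""
    return v_rms ** 2 / nominal_impedance_ohms

# e.g. 8 V RMS measured across a nominal 8-ohm speaker:
print(estimated_power_watts(8.0, 8.0))  # -> 8.0 W
```

The real draw will deviate from this wherever the impedance curve dips or the phase angle swings, which is the next point.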
If you cannot take measurements, then it's too difficult to draw a conclusion, because you would be stacking assumptions on top of assumptions based on manufacturer specifications for both the receiver and the speakers. Some amplifiers have trouble driving current at extreme phase angles; some have trouble delivering high current into low-impedance speakers (and manufacturers don't publish these details). If you can find a review of your speakers that measures their reactivity (impedance vs. frequency, phase angle vs. frequency, sensitivity vs. frequency) and a review of your amplifier that measures how it handles varying impedance without inducing an unacceptable amount of IM distortion or other kinds of distortion, you can make a guesstimate with the understanding that there is a large margin of error.
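For what a spec-sheet guesstimate looks like, here is a sketch using the standard sensitivity formula. The speaker sensitivity (dB SPL at 1 W/1 m), target SPL, and listening distance are all assumed inputs; it ignores room gain, impedance dips, and phase angle, which is exactly why the margin of error is large:

```python
import math

# Back-of-envelope amplifier power guesstimate from datasheet numbers.
# sensitivity_db: speaker output in dB SPL at 1 W / 1 m (manufacturer
# spec, often optimistic). Free-field assumption: SPL falls ~6 dB per
# doubling of distance.

def watts_needed(sensitivity_db: float, target_spl_db: float,
                 distance_m: float = 1.0) -> float:
    # SPL required at 1 m to reach the target at the listening distance
    spl_at_1m = target_spl_db + 20 * math.log10(distance_m)
    # every +10 dB over the sensitivity rating costs 10x the power
    return 10 ** ((spl_at_1m - sensitivity_db) / 10)

# 88 dB/W/m speaker, 100 dB peaks at 3 m:
print(round(watts_needed(88, 100, 3.0), 1))
```

Numbers like these are only as good as the published sensitivity figure, so treat the output as order-of-magnitude guidance.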
In real life (academics aside): if your amplifier clips with normal program material, you need a better amplifier. If it doesn't clip but you are still curious, borrow the new receiver and A/B between them, but make sure the voltage at the speaker terminals is within 0.1 mV between the two during the comparison; otherwise the slightly louder receiver will most likely sound better (psychoacoustics).
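To see why the level matching matters, you can convert a voltage ratio at the terminals into a loudness difference in dB (a hypothetical pair of readings is used below); listeners reliably prefer the louder source even at fractions of a dB:

```python
import math

# Convert two terminal-voltage readings into a level difference in dB.
# 20*log10 because power (and thus SPL) goes with voltage squared.

def level_difference_db(v_a: float, v_b: float) -> float:
    return 20 * math.log10(v_a / v_b)

# e.g. 2.83 V vs 2.80 V at the terminals:
print(round(level_difference_db(2.83, 2.80), 3))
```

Even that ~0.03 V mismatch is enough to bias an A/B comparison, which is why the tolerance in sighted or blind tests is kept so tight.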