You can always use a power cord that is rated at a higher voltage than the maximum required by your setup. Your original cord was rated at 10A, 125V. The one you found is also rated at 10A, and it will handle 250V, twice the voltage it will actually encounter.
Bottom line - You should be good to go with the new cord.
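If it helps, here is the check I'm describing as a tiny Python sketch. The helper name and the "needed" figures are just placeholders for your setup, not from any standard:

    # Hypothetical helper: a replacement cord is adequate if both of its ratings
    # meet or exceed what the setup actually demands.
    def cord_is_adequate(cord_volts, cord_amps, needed_volts, needed_amps):
        return cord_volts >= needed_volts and cord_amps >= needed_amps

    # Original cord: 10A / 125V. Replacement: 10A / 250V. The load still sees 125V.
    print(cord_is_adequate(250, 10, 125, 10))  # True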
Unless my understanding of Ohm's law is completely off, isn't that wrong? After all, relative resistance (in ohms, Ω) goes down as voltage increases (which is why long-distance power cables are run at several thousand volts), which means that at higher voltages, cables can be thinner.
Thus, a cable rated for 10A at 125V has a resistance of 12.5 Ω (125V / 10A), while a cable rated for 10A at 250V has a resistance of 25 Ω. Running that cable at half the voltage (125V) therefore also halves the current it can handle, since resistance is a physical property of the cable: a cable rated at 250V and 10A is only rated to handle 5A at 125V (125V / 25 Ω = 5A).
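To make my arithmetic explicit, here it is as a quick Python sketch. It assumes the rating quotient really behaves like a fixed resistance, which is exactly the assumption I'm asking about:

    # Treating each cable's rating quotient as a fixed resistance (my assumption).
    r_125 = 125 / 10                  # 12.5 ohm for the 10A / 125V cable
    r_250 = 250 / 10                  # 25.0 ohm for the 10A / 250V cable
    amps_at_125 = 125 / r_250         # 5.0 A: half the rated current at half the voltage
    print(r_125, r_250, amps_at_125)  # 12.5 25.0 5.0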
Of course, 5A at 125V is 625W, so you would still be able to run about ten laptops off that cable before reaching its limits. But as a general rule, you can safely run a cable at a higher voltage than its rating at the same amperage, but not at a lower voltage.
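And the power figure under the same reasoning, plus a rough ~60W per laptop (my own ballpark, not from the question):

    # Power available at the derated 5A, and a rough laptop count at ~60W each.
    power_w = 5 * 125        # 625 W
    laptops = power_w // 60  # about 10
    print(power_w, laptops)  # 625 10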
(Another example of this is that some modern smartphones run quick charging systems at higher voltages over the same cables that previously could only handle, say, 2A at 5V. Increasing voltage then allows you to pass more power through the same cable, without changing anything else. The Quick Charge system for Samsung's Note 4 runs at 1.67A at 9V, transmitting 15W over a cable that could previously only handle 10W (2A at 5V).)
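The same comparison as plain numbers, using only the 2A/5V and 1.67A/9V figures quoted above:

    # Same cable, higher voltage: power before and after quick charging.
    old_w = 5 * 2.0        # 10.0 W at 5V / 2A
    quick_w = 9 * 1.67     # ~15.0 W at 9V / 1.67A (Note 4 quick charging)
    print(old_w, quick_w)  # 10.0 15.03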