Most electrical power sources are run by systems that control the voltage. For example, in North America and many other places, the wall outlets provide 115 V as Alternating Current (AC). The utility company works hard to make sure that, no matter how much current is being pulled through their systems by thousands of users, the voltage at your wall outlets stays the same. When you plug in an appliance (say, a baseboard heater), its working component, the heating coil, has a fixed resistance. Thus, the current in Amps that can flow through that heater's coil is determined by those other fixed factors: Amps = Volts divided by Resistance (that's Ohm's Law). Now, you will note that many baseboard heaters have visible heating coils that get red hot when operating. That is because the WORK being done by this electrical device is all being converted to HEAT and radiated out into the room. You MIGHT try to get even more heat (work) out of this heater by supplying it with an even higher voltage, and thus forcing more current through it. However, the coiled wire would get so hot the metal would melt and the wire would break, stopping the circuit completely. This is because the heater designers carefully made the unit so that, given just the right voltage supply, the heating coil reaches just the right temperature at which the rate of heat escaping the coil into the room exactly matches the rate of heat being generated in the coil by the electricity flow. So it is not a good idea to try to give that heater more voltage. Similarly, a motor might run faster if you fed it a higher voltage, BUT it would also generate a lot more heat inside its wire windings, and that heat would melt metal and cause failure that way.
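To make the Ohm's Law point concrete, here's a small sketch. The 8.8-Ohm coil resistance is just an assumed value for illustration (it's not from any real heater spec); the point is only that for a fixed resistance, a higher supply voltage forces more current, and hence more watts of heat, through the same coil.

```python
# Ohm's Law sketch for a hypothetical 115 V baseboard heater.
# The 8.8-Ohm coil resistance is an ASSUMED value, for illustration only.

def amps(volts, ohms):
    """Ohm's Law: Amps = Volts / Resistance."""
    return volts / ohms

def watts(volts, ohms):
    """Rate of heat output: Watts = Volts x Amps."""
    return volts * amps(volts, ohms)

coil_r = 8.8  # Ohms (assumed fixed coil resistance)
print(watts(115, coil_r))  # roughly 1500 W at the design voltage
print(watts(130, coil_r))  # overvoltage: same coil, noticeably more heat
```

Notice that power rises with the square of the voltage here, which is why even a modest overvoltage pushes the coil well past its design heat output.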
We measure electricity use in WATTS, and others have pointed out that Watts = Volts x Amps. Watts is a unit of the RATE of doing WORK, just as Horsepower is a rate of doing work. In fact, you can convert one to the other: 1 HP = 746 Watts. So OP's coffee maker (uses 120 V at 7 Amps = 840 Watts) uses a little over 1 Horsepower. Heating water takes work. To heat one gram of water by one Celsius degree takes one calorie. It turns out that one Watt provides 0.2388 calories per second. Since OP's coffee maker converts 840 Watts to heat, it can heat at the rate of about 200 calories per second. Now, let's say the water put into the pot was at 20 C - room temperature - and it has to get to 100 C. So the heater has to raise the water temperature by 80 Celsius degrees. It can do that in one second for (200 / 80) = 2.5 grams of water. (In truth, the heater has to give the water lots more heat than that to actually make some of it boil, but that's more complication.) This is just to illustrate that the WATTS that come from electricity do some useful WORK on something - they heat the water.
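The coffee-maker arithmetic above can be checked in a few lines of code. All the numbers here (120 V, 7 A, the 0.2388 cal/s-per-Watt conversion, 20 C starting water) come straight from the text:

```python
# Re-doing the coffee-maker arithmetic from the text.
CAL_PER_SEC_PER_WATT = 0.2388  # 1 Watt delivers 0.2388 calories per second

volts, amps = 120, 7
power_watts = volts * amps                       # 840 W
cal_per_sec = power_watts * CAL_PER_SEC_PER_WATT # ~200 cal/s
delta_t = 100 - 20                               # raise water from 20 C to boiling
grams_per_sec = cal_per_sec / delta_t            # ~2.5 g of water per second
print(power_watts, cal_per_sec, grams_per_sec)
```

So at full power the heater can bring about 2.5 grams of water per second up to 100 C (before accounting for the extra heat needed to actually boil any of it).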
Other types of electrically-powered machines do different types of work for us, but each always is just a machine to convert electrical energy into other forms of energy like heat and mechanical movement, at a RATE that depends on the Watts of energy being consumed. A higher wattage rating on a machine means it has been designed to consume and convert electrical energy at a faster rate; so, a larger motor may be able to power a larger fan and push more air around the room.
Now, on to car batteries. Ideally, a battery also is a source of a fixed voltage. In cars it is usually about 12.6 VDC. But no battery is ideal. In fact, each behaves like it has inside its case both an ideal battery that always provides a fixed voltage, PLUS (in series with it) a resistor that reduces the actual voltage available at the external terminals according to how much amperage is being pulled out at that moment. Then there's another complication. A battery stores electrical energy by using reversible chemical reactions. When it is charged after first being built, the charging current causes chemical reactions that use up a bit of its material to make new chemicals. These are stable and are the source of the voltage you can measure with a voltmeter. After that, if you connect a load (say, a headlight) to the battery, the voltage will cause current to flow through the load. For that to happen, those same chemical reactions have to proceed in the OPPOSITE direction, regenerating the original chemicals in order to release the electrons needed for that current flow. When this happens, the supply of the "charged-up" chemicals is reduced, and hence the voltage is reduced slightly. If this continues for a long time, eventually the voltage will be reduced a lot, and we have to re-charge the battery to restore it to full voltage. That is why a car has a generator (powered by a belt from the engine) that provides a charging current to the battery as you drive.
OP, you asked why a car battery is rated in Amps? Good one, and the reason is that apparent internal resistance. Also, they are not being completely precise in making the statement. The specification is often for "Cranking Amps", and what it means is that during the use of the starter motor to start the engine, the battery will provide that many amps of current to the starter while still maintaining a minimum voltage to the starter of (some particular number, probably about 8 to 9 Volts). You may have noticed that the Cranking Amps number is big - often 300 to 500 Amps - because that starter motor has to do a LOT of work to turn over the engine and get it going. If the battery is too small, or if it is so old it cannot perform as specified, when you try to use it the heavy current flowing through the starter motor will reduce the actual VOLTAGE getting to the starter, and it won't get the required total power (Watts = Volts x Amps) to do that work. That is one of the consequences of that Ohm's Law thing that says Amps = Volts / Resistance. Re-stating that equation, Volts = Amps x Resistance. So, for that apparent internal resistor in the battery, the voltage drop across it is its resistance times the current being drawn. For a current draw of 300 Amps, the battery terminal voltage could drop by 6 volts if the internal resistance is 6/300 = 0.02 Ohms. In such a case, in fact, the battery probably won't even deliver the 300 Amps the starter wants - it will fall to some lesser point - and the starter won't turn over the engine fast enough to get it going. In fact, usually when this happens the reserve of "charged-up" chemicals in the battery is so low that its output decreases rather quickly, and the starter motor pretty soon stops turning at all. That's when we know we have a "dead battery" that won't start the car.
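The "ideal battery plus internal resistor" model above is easy to sketch in code. The 12.6 V open-circuit voltage, 0.02-Ohm internal resistance, and 300 A draw all come from the example in the text:

```python
# The "ideal battery + internal series resistor" model from the text.

def terminal_voltage(open_circuit_v, internal_r, amps_drawn):
    """Voltage at the battery posts = ideal voltage minus the drop
    across the internal resistance (Volts = Amps x Resistance)."""
    return open_circuit_v - internal_r * amps_drawn

# Healthy battery, light load (headlights): barely any drop.
print(terminal_voltage(12.6, 0.02, 10))   # 12.4 V

# Same battery cranking the starter at 300 A: a 6 V drop,
# leaving only 6.6 V at the posts - well below the 8-9 V the starter needs.
print(terminal_voltage(12.6, 0.02, 300))  # 6.6 V
```

This is why the same battery that happily runs the headlights can still fail to crank the engine: the voltage drop grows in direct proportion to the current being drawn.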
So, although a car battery is designed to deliver a fixed voltage to a load you connect to it, it has real limits on what it can do. And for us people who want the car to start reliably, an important other specification from the battery maker is how much real current it can provide under heavy load to turn over the starter motor (and hence the engine).
For people like me who live in colder areas (in the winter), you'll also see a specification for "Cold Cranking Amps". You see, the chemical reactions that go on inside a battery are affected by temperature - the lower the temperature, the lower the output voltage - and hence, the lower the current it can push through the starter. So the battery makers also specify a Cranking Amps figure at a particular cold temperature so you can decide whether a battery is good enough to work in your cold winter weather.