In an internal combustion engine, a piston compresses a mixture of fuel and air from a large volume into a very small space. The ratio of the maximum cylinder volume to the minimum compressed volume is called the "compression ratio." Compressing the fuel and air makes them burn faster, which (though I'm not sure exactly how) makes the engine run better. Comparing the high compression ratio (12.5:1) of the 11,000 RPM Hayabusa engine with the low compression ratio (9.8:1) of the 6,500 RPM Mustang V8, I'm guessing that high compression allows for a much higher redline - the faster burn speed of the compressed fuel-air mix in the Hayabusa engine would let combustion finish before the piston completes its stroke at high RPMs.

There are secondary benefits to high compression ratios, too. High-compression engines burn both much more cleanly and much more efficiently than lower-compression engines. For example, a diesel engine, which burns fuel very differently from a gasoline engine, will often give fuel economy 60% greater than its gas equivalent, even though diesel only has about 10% more energy per gallon. According to Wikipedia, the increase in efficiency is due to the additional heat and Brownian motion caused by compression fully vaporizing the fuel, which sounds a little fishy to me considering how much work is put into cooling the fuel-air mix in turbocharged cars. Most other websites say it's due to the Carnot cycle, which I honestly do not understand - could someone explain it?

A related issue is engine efficiency under partial throttle. An engine limits power by restricting the intake of fuel and air; if only half the fuel-air charge enters a cylinder, the effective compression ratio is roughly halved as well.

Considering all the advantages of high compression, one might wonder why anyone would not use a high compression ratio. The answer is simple: the increased temperature of the compressed charge will cause the fuel to begin combusting before the spark plug fires, resulting in an undesirable burn pattern. This detonation, or "knock," is often heard as a pinging noise and can cause severe damage to your engine. The measure of a fuel's resistance to this kind of autoignition is its octane rating, which is not, as is commonly believed, a measure of its energy per liter.

Before the advent of the catalytic cracker, fuel was often below seventy octane, and engine compression ratios were correspondingly low - a Model T had a compression ratio of 4.5:1. Cracking (splitting large hydrocarbon molecules into smaller ones) raised octane dramatically, and modern engines are both more efficient and better-performing than their older brethren as a result. Modern premium gas has an octane rating of about 93 in the US and about 97 in the rest of the world (the US number is on a different scale, (RON+MON)/2, which reads a few points lower than the RON scale used elsewhere). 100+ octane gas can be had, but it's very expensive (well over $5/gallon) and often relies on octane-boosting additives that contain lead.

However, all of these are dwarfed by ethanol, which has an octane rating of a whopping 129. As a result, it can easily be used in engines with a compression ratio of 15:1 or greater, and despite having an energy density only about 2/3 that of gasoline, it should produce similar fuel economy in such an ethanol-optimized engine, along with very, very high redlines.

So, any thoughts on any of this?
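
To put some rough numbers on the efficiency question: I suspect the ideal cycle those sites actually mean is the Otto cycle rather than Carnot. Its textbook efficiency is eta = 1 - 1/r^(gamma-1), where r is the compression ratio and gamma is the heat-capacity ratio of the gas (~1.4 for air). Here's a quick Python sketch of what that formula gives for the engines above - these are theoretical upper bounds, not real-world figures, and the half-throttle "effective ratio" is just my own hand-waving from the throttling argument earlier:

```
# Ideal Otto-cycle thermal efficiency: eta = 1 - 1 / r**(gamma - 1).
# gamma ~ 1.4 for air. Real engines lose a lot to heat, friction, and
# incomplete combustion, so treat these as upper bounds, not predictions.

GAMMA = 1.4

def otto_efficiency(r, gamma=GAMMA):
    """Ideal thermal efficiency of an Otto cycle at compression ratio r."""
    return 1.0 - r ** (1.0 - gamma)

examples = [
    ("Mustang V8, 9.8:1", 9.8),
    ("Hayabusa, 12.5:1", 12.5),
    ("Ethanol-optimized, 15:1", 15.0),
    ("Half throttle, ~4.9:1 effective", 4.9),  # my hand-wavy throttling guess
]

for name, r in examples:
    print(f"{name}: {otto_efficiency(r):.1%}")
# -> roughly 59.9%, 63.6%, 66.1%, and 47.0%
```

So even in the ideal case, going from 9.8:1 to 12.5:1 is worth a few percentage points, and throttling hurts a lot - which at least matches the direction of the claims above.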
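
And a sanity check on the fuel-economy claims, under the crude assumption that economy scales with (energy per gallon) times (thermal efficiency). The diesel figures are the ones quoted above; the ethanol comparison reuses the ideal Otto numbers from the sketch, so take it as a rough bound:

```
# Crude model: fuel economy ~ (energy per gallon) * (thermal efficiency).

def otto_efficiency(r, gamma=1.4):
    return 1.0 - r ** (1.0 - gamma)

# Ethanol at 15:1 vs gasoline at 9.8:1, with ~2/3 the energy per gallon:
ratio = (2.0 / 3.0) * otto_efficiency(15.0) / otto_efficiency(9.8)
print(f"ethanol vs gasoline economy: {ratio:.0%}")  # ~74% in this model

# Diesel: 60% better economy on ~10% more energy per gallon implies the
# engine itself is roughly 45% more efficient per unit of fuel energy:
print(f"implied diesel efficiency gain: {1.60 / 1.10 - 1.0:.0%}")
```

In this crude model the compression gain only closes part of ethanol's energy-density gap, so "similar fuel economy" would need help from ethanol's other properties (its strong charge-cooling effect, for instance) - which makes me even more curious what the right explanation is.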