ECC memory is error-correcting - it can repair 1 incorrect bit out of (typically) 64 bits and can detect (but not repair) 2 incorrect bits out of 64. Memory errors can be caused by faulty hardware, but are more commonly caused by particle strikes from cosmic radiation (mostly neutrons at ground level), and less often by radiation emitted by devices in the machine itself (e.g. the lead solder used on motherboards and cards can contain trace amounts of radioactive material, which emits alpha particles). How common cosmic-ray errors are is a debatable point: estimates from IBM indicate they could happen as often as once per month at sea level in one 128MB stick of DRAM, while Micron says more like once a year. At higher elevations, the problem becomes substantially worse.
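To make the "correct 1 bit, detect 2 bits" behavior concrete, here's a toy SECDED (single-error-correct, double-error-detect) code in Python. It's a Hamming(7,4) code plus one overall parity bit, so 4 data bits instead of 64, but real ECC DIMMs use the same idea scaled up to 64-bit words:

```python
def encode(data):
    """Encode 4 data bits into an 8-bit SECDED codeword.

    Layout: [overall parity, p1, p2, d1, p3, d2, d3, d4],
    with Hamming parity bits p1..p3 at positions 1, 2, 4.
    """
    d = data
    p1 = d[0] ^ d[1] ^ d[3]          # covers positions 1, 3, 5, 7
    p2 = d[0] ^ d[2] ^ d[3]          # covers positions 2, 3, 6, 7
    p3 = d[1] ^ d[2] ^ d[3]          # covers positions 4, 5, 6, 7
    word = [p1, p2, d[0], p3, d[1], d[2], d[3]]
    p0 = 0
    for b in word:                   # overall parity over the whole word
        p0 ^= b
    return [p0] + word

def decode(code):
    """Return (data, status); status is 'ok', 'corrected', or 'double-bit error'."""
    p0, word = code[0], list(code[1:])
    s1 = word[0] ^ word[2] ^ word[4] ^ word[6]
    s2 = word[1] ^ word[2] ^ word[5] ^ word[6]
    s3 = word[3] ^ word[4] ^ word[5] ^ word[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # points at the flipped position (1-7)
    overall = p0
    for b in word:
        overall ^= b
    if syndrome == 0 and overall == 0:
        status = 'ok'
    elif overall == 1:               # odd number of flips -> assume one, fix it
        if syndrome != 0:
            word[syndrome - 1] ^= 1
        status = 'corrected'
    else:                            # even flips but nonzero syndrome: 2 errors
        status = 'double-bit error'
    return [word[2], word[4], word[5], word[6]], status

codeword = encode([1, 0, 1, 1])
flipped = codeword.copy()
flipped[5] ^= 1                      # simulate a cosmic-ray bit flip
print(decode(flipped))               # recovers [1, 0, 1, 1], status 'corrected'
```

Flip any single bit and decode() repairs it; flip two and it reports an uncorrectable error instead of silently returning bad data - which is exactly what your chipset logs as a multi-bit ECC error.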
I take the unpopular stance on this board that, if you value your data, you should have ECC memory. Most everyone else on here disagrees with me (does anyone agree?), saying that only servers need it. I use it on one of my two machines - the one that I use to log into work - but not on the other (the one that I usually use to play games and experiment around with).
A few facts:
1. ECC memory is more expensive.
2. ECC memory will result in a minor performance hit - 3-5% in memory benchmarks, substantially less than this in system-level benchmarks.
3. ECC will increase your system's data integrity and uptime - but by how much is debatable... not only by the members of this BBS, but also by the corporations producing memory.
A list of some modern chipsets/boards that support ECC is here.
Note: one correction to an above post - ECC on SDRAM operates on 64-bit words. Single-bit correction over 64 data bits requires 7 check bits, and detecting double-bit errors takes one more, for a total of 8. Since memory is organized in 8-bit chunks anyway, that works out neatly: ECC SDRAM modules are 72 bits wide (64 data + 8 check).
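The check-bit count follows from the Hamming bound: r check bits must distinguish "no error" plus an error in any of the m + r bit positions, so you need 2^r >= m + r + 1. A quick sketch of that arithmetic:

```python
def min_check_bits(data_bits):
    """Smallest r satisfying 2**r >= data_bits + r + 1,
    the Hamming bound for single-error correction."""
    r = 1
    while 2 ** r < data_bits + r + 1:
        r += 1
    return r

# 64 data bits need 7 check bits for correction alone; SECDED adds one
# overall-parity bit on top, giving the 72-bit width of ECC DIMMs.
print(min_check_bits(64))        # 7
print(min_check_bits(64) + 1)    # 8
```

The same formula shows why narrower words are proportionally more expensive: 8 data bits would need 4 check bits (50% overhead), while 64 data bits need only 8 (12.5% overhead).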