Never seen it documented? You can ask PM... he doesn't come here anymore due to flamers, but he works at Intel and did a lot of research into overclocking and its longer-term effects. If the Highly Technical board comes online, you'll be able to talk to him there, but otherwise you'll have to email him. Do a user search for him if you want his email address.
It does indeed lower the life expectancy of chips. The only question is by how much. One way they can die is from overheating; another is electromigration (where the current flow literally knocks metal atoms out of the chip's interconnects over time).
EDIT: also, increasing the voltage is a BAD thing. It increases wattage approximately in proportion to the square of the voltage. Increasing voltage can sometimes increase stability, but also remember what voltage is - electrical potential - and more of it means more stress on the transistors.
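To put a rough number on that square-law point: dynamic power is commonly approximated as P ≈ C·f·V², so at the same clock, the power multiplier from a voltage bump is just the voltage ratio squared. A minimal sketch (the voltages here are made-up examples, not from any real datasheet):

```python
def relative_power(v_new: float, v_stock: float) -> float:
    """Power multiplier from raising core voltage alone, holding
    capacitance and frequency constant (P scales with V squared)."""
    return (v_new / v_stock) ** 2

# Example: bumping a hypothetical 1.65 V part to 1.80 V
print(relative_power(1.80, 1.65))  # ~1.19, i.e. ~19% more heat to dissipate
```

In practice overclockers raise frequency along with voltage, so the real power increase is even worse than the square law alone suggests.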
AMD and Intel have tolerances and specifications for a reason - sometimes you can go beyond them (and yes, I overclock sometimes too), and sometimes you can't. Going over the clock rate isn't usually a terribly bad thing, but going over the voltage spec (which, I believe, tends to be ±10% of the spec'ed voltage) tends to be a bad thing. AMD and Intel don't go over their initial spec voltages for their CPUs.
For example, if the spec'ed voltage for a chip is 1.65 V, the later CPUs don't usually go over ~1.75 V. Later revisions of the chip might, but a revision is technically a different chip (not by a lot, usually) - it will have different tolerances, and thus different spec'ed ranges of operation.
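For what it's worth, that 1.65 V → 1.75 V example sits just inside a ±10% window, which is probably why it's about as far as anyone sane pushes it. A quick sketch of the check (the ±10% figure is my recollection above, not an official number):

```python
def within_tolerance(v: float, v_spec: float, tol: float = 0.10) -> bool:
    """True if voltage v lies within +/- tol of the spec'ed voltage."""
    return abs(v - v_spec) <= tol * v_spec

print(within_tolerance(1.75, 1.65))  # True: ~6% over a 1.65 V spec
print(within_tolerance(1.85, 1.65))  # False: ~12% over, outside +/-10%
```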
My 2 cents.