Never. This is the same installation that began with my first HD (60 MB, that's M, not G) and Windows 3.0. Before that, there was no installation to keep, because it was just a stack of floppies. I've been through many motherboards (obviously), and many more HDs.
And I never will, as long as I can find a way around it. People who do it must not use their computers for much; if they did, they would find reinstalling impractical. If I could somehow determine just what was necessary to get my computer the way it is, I think it would take several weeks of installing things to get it back.
I just copy the old HD to the new. New mobo? I delete the old drivers and install the new ones. New OS? I upgrade over the old. If something doesn't work, I use elaborate debugging procedures to determine what is necessary to make it work. On one occasion, moving to an Intel BX-style mobo, I had an unusable main installation for a week, but I did pin down what was necessary to make it work. If you don't care whether you keep your installation, of course it would be ridiculous to go through all the trouble I sometimes have.
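(For anyone curious about the mechanics of the copy step: with both drives in the machine, on a 9x system I'd do something along the lines of

    xcopy32 c:\*.* d:\ /e /h /k /r /c
    sys d:

from a DOS box: copy everything including hidden and read-only files, keep attributes, don't stop on errors, then make the new disk bootable. Treat that as a sketch from memory, not a recipe; a dedicated cloning tool is safer for a system drive.)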
For testing and debugging, I always have two installations of Windows on the computer besides the one I use. With XP, I decided I would not upgrade the main installation until I had everything working in one of the auxiliary installations. That turned out to be very wise, but not even that turned out to guarantee that everything would work in the main installation.
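(In case it isn't obvious how that's set up: under NT-family Windows, the extra installations are just more entries in boot.ini, something like

    [boot loader]
    timeout=30
    default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
    [operating systems]
    multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Main XP" /fastdetect
    multi(0)disk(0)rdisk(0)partition(2)\WINDOWS="Test XP" /fastdetect
    C:\="Windows 98SE"

where the partition numbers and labels are only an example, not my actual layout.)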
Really, until W95, there was not much difference between a reinstallation and deleting a few files or editing some lines in SYSTEM.INI and WIN.INI. With the expanded role of the Registry since Windows 95, they turned what was a minor bug in Windows into an insidious cancer. It was not until Plug and Play that installing a device driver became potentially impossible.
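(To put that in perspective: under 3.x, a driver was a line you could see and edit, something like

    [boot]
    mouse.drv=mouse.drv
    [386Enh]
    device=*vshare

in SYSTEM.INI; delete the line and the driver is gone. The exact names are from memory, so treat them as illustrative. Post-95, the same information is smeared across binary Registry keys like HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\Class, where hand-editing is a good way to break things.)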
If Windows requires reinstallation from scratch because of a hardware change, or for no change at all, it just shows what a pathetic piece of trash it is at its core, and how ultimately undependable it is. I always chuckle to myself when I see yet another person brag about how great XP is. Compared to 98SE, there is a lot more broken about XP than fixed. For an average end user who maintains his own computer, XP is a giant leap backwards. It is a completely disorganized, intricate, labyrinthine mess. (Probably a reflection of MS's founder's brain organization.) True, when it works, it seems miraculous. When something doesn't, it is virtually impossible to pin down why. I suppose it is because of all the "advanced" concepts put into it. It is so advanced that not even the programmers who write it can understand it. To make XP usable, they hope to create programs (Wizards) that can understand it ... someday.