It depends on your exact driver setup and what you're trying to do.
If you're using the generic default Microsoft drivers for something (like a GF2), then you don't want to delete them, since they come with Windows. But if you've installed the manufacturer's drivers, you might want to remove them so that there are no "old" drivers sitting on the drive when Windows goes looking for drivers for a device. What you'd need to do is look at the current driver file list for the device (before uninstalling) and delete anything that isn't from Microsoft. You'll have to write the list down, and probably reboot into Safe Mode so that Windows isn't trying to use those files; otherwise you'll get Access Denied errors. Alternatively, you could switch the display driver to standard VGA, so that the device-specific drivers aren't in use anymore.
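If you'd rather not copy that list down by hand, a short script can do the survey. This is just a sketch: it assumes a Windows version that includes the built-in driverquery tool (XP and later), and the CSV column names are assumptions based on its typical output. It only lists candidates; it doesn't delete anything.

```python
import csv
import subprocess

# /si = signed-driver info (includes the manufacturer), /fo csv = CSV output.
output = subprocess.run(
    ["driverquery", "/si", "/fo", "csv"],
    capture_output=True, text=True, check=True,
).stdout

# Column names ("Manufacturer", "DeviceName", "InfName") are assumptions
# based on typical driverquery /si output; adjust if yours differ.
for row in csv.DictReader(output.splitlines()):
    if "microsoft" not in row.get("Manufacturer", "").lower():
        print(row.get("DeviceName", "?"), "-", row.get("InfName", "?"))
```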
Then you'll be able to do a "clean" install of new drivers.
If you're updating Detonator (and probably ATi Catalyst) drivers, an uninstall routine is provided, which deletes the drivers from your system (instead of you doing it manually as I described); when you reboot, Windows will detect the video card again. I've found that, unfortunately, it's best to uninstall the Detonator drivers before installing new ones; it would be nice if it could all be done in one go.
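On more recent versions of Windows, that same "remove the old package first" step can be scripted with pnputil, run from an elevated prompt. A sketch under those assumptions; the INF name oem42.inf is a placeholder, and the /delete-driver and /uninstall switches are Windows 10 syntax (older releases use pnputil -f -d instead):

```python
import subprocess

# Show the third-party driver packages staged in the driver store.
subprocess.run(["pnputil", "/enum-drivers"], check=True)

# Remove one staged package and uninstall it from any devices using it.
# "oem42.inf" is a placeholder; use the published name from the listing above.
subprocess.run(
    ["pnputil", "/delete-driver", "oem42.inf", "/uninstall"], check=True
)
```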
If you're removing a device entirely and switching to something else, you don't really need to remove the drivers unless the new card is based on the same chipset. When you uninstall a device in Device Manager, the drivers are still on the hard drive, so they're "available": if you put in another card with the same chipset, Windows will use those same drivers unless you actually delete them from the system. If you put in a card with a different chipset (and keep in mind that Detonator and Catalyst drivers each cover an entire range of chipsets with one driver set), Windows will prompt you for the proper drivers unless it has generic default drivers for it; after that, you can just install the driver package and Windows will stop using the generic ones.
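That "same chipset" matching happens on the card's PnP hardware ID. If you want to see the ID Windows is matching against, here's a quick sketch, assuming the wmic tool (shipped with XP Pro through Windows 10):

```python
import subprocess

# PNPDeviceID is a standard property of the Win32_VideoController WMI class;
# two cards with the same chipset share the same VEN_/DEV_ portion of it.
output = subprocess.run(
    ["wmic", "path", "win32_videocontroller", "get", "name,pnpdeviceid"],
    capture_output=True, text=True, check=True,
).stdout
print(output)

# Illustrative output - the VEN_/DEV_ pair identifies the chipset:
#   GeForce2 MX   PCI\VEN_10DE&DEV_0110&SUBSYS_...
```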