Overclocking the CPU helps with CPU-intensive tasks (unzipping huge files, for example), but it doesn't make a large enough impact on gaming to be worthwhile.
The answer to the other question is a bit more complex. Both CPUs and GPUs are essentially doing math: logical operations on values stored as 1s and 0s in memory. Now, the computer can only tell a 1 from a 0 by the amount of electricity present; a 0 is a low amount while a 1 is a higher amount. Anything above a certain threshold gets read as a 1.
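If it helps, here's a toy sketch of that thresholding idea. The numbers are made up for illustration; real chips do this with analog circuitry, not code:

```python
# Toy illustration of reading a bit from a voltage level.
# The 0.6 threshold is an invented example, not a real chip spec.

def read_bit(voltage: float, threshold: float = 0.6) -> int:
    """Interpret a measured voltage as a 0 or a 1."""
    return 1 if voltage >= threshold else 0

print(read_bit(0.05))  # well below the threshold -> 0
print(read_bit(1.00))  # well above the threshold -> 1
print(read_bit(0.58))  # borderline: a little noise can flip this result
```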
Overclocking means your computer tries to do these calculations faster, so it has to move around and change the values more times per second. When it goes too fast, it starts to get sloppy, and stores 0s as 1s or 1s as 0s by accident. Then errors and glitches start happening.
To compensate for that, you can tell the computer to use a higher threshold and move more electricity around, so the margin of error when flipping values matters less. But moving more electricity around produces extra heat.
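To put rough numbers on why more electricity means more heat, here's a small sketch using the standard dynamic-power approximation (power scales with voltage squared times clock speed). The figures are illustrative, not from any real chip:

```python
# Rough sketch of the dynamic-power approximation: P is proportional to V^2 * f.
# Scaling factors are relative to stock settings; numbers are for illustration only.

def relative_power(v_scale: float, f_scale: float) -> float:
    """Power draw relative to stock when voltage and frequency are scaled."""
    return (v_scale ** 2) * f_scale

# A 10% overclock paired with a 10% voltage bump:
print(relative_power(1.10, 1.10))  # ~1.33 -> roughly 33% more heat to get rid of
```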
In a laptop, the parts are very carefully calibrated to deliver a certain performance within a very tight heat envelope, since a laptop has far weaker fans and much less airflow than a desktop. Overclocking can be done to a small extent "for free", that is, without raising the voltage to compensate for errors, but that headroom is tiny on a laptop because almost all of the performance has been eked out already. Raising the voltage even a tiny bit results in a pretty significant increase in heat, so it's usually not worth it on a laptop.
This is not 100% technically correct, but I figure it's good enough to explain voltage and its role in overclocking.