How voltage changes the operation of the PLL loop can only be explained fully by the Intel engineers who designed it. It's essentially a feedback mechanism (frequency synthesizer, VCO, etc.) that keeps the output clock phase-aligned with the reference clock. As the output clock frequency is increased, downstream sampling margins shrink if there is any phase variance between the CPU clock and any related sub-domain (PCIe, DMI, etc.) that obtains its reference clock from another source or another output node of the master clock generator (transmission line variance and other factors come into play here). There is also the chance that the level of voltage applied has an impact on output clock jitter, and higher levels of jitter will reduce the sampling windows.
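If you want a rough feel for the numbers, here's a toy back-of-the-envelope in Python. Every figure in it is made up for illustration (nothing from a datasheet); it just shows why a faster clock plus skew plus jitter squeezes the usable sampling window:

```python
# Toy sketch: margin left in one clock period after fixed timing costs.
# All numbers are invented for illustration, not from any real part.

def sampling_margin_ps(freq_ghz, setup_hold_ps, skew_ps, jitter_pp_ps):
    """Sampling margin (ps) left in one period of the output clock.

    freq_ghz      -- output clock frequency in GHz
    setup_hold_ps -- combined setup + hold requirement of the receiver
    skew_ps       -- phase variance vs the sub-domain clock (PCIe, DMI, etc.)
    jitter_pp_ps  -- peak-to-peak clock jitter
    """
    period_ps = 1000.0 / freq_ghz
    return period_ps - setup_hold_ps - skew_ps - jitter_pp_ps

# Same fixed costs, higher frequency: the margin collapses fast.
for f in (4.0, 5.0, 6.0):
    print(f"{f:.1f} GHz -> margin {sampling_margin_ps(f, 80, 30, 40):.0f} ps")
# 4.0 GHz -> margin 100 ps
# 5.0 GHz -> margin 50 ps
# 6.0 GHz -> margin 17 ps
```

The fixed costs don't scale down with the period, which is why the same amount of jitter hurts more the higher you clock.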
By changing the PLL voltage at the CPU side, you are either making a very subtle change to the oscillating frequency of the VCO (it is a voltage controlled oscillator, after all), or you are affecting the feedback loop of the PLL (bandwidth and gain). This can alter the output clock such that the downstream sampling margin gets either better or worse. The effects of PLL voltage manipulation will vary from platform to platform depending upon the implementation, and perhaps even upon temperature drift of the oscillator (an insight into why things change when you go cold).
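To make the feedback-loop point concrete, here's a crude first-order PLL model in Python. The linear phase detector and all the gain numbers are assumptions for illustration only, not how Intel's loop is actually built; the point is just that a small shift in VCO gain (one plausible effect of PLL voltage) changes the loop's effective bandwidth, i.e. how quickly phase error is pulled back toward zero:

```python
import math

# Minimal first-order PLL sketch (linearized). Assumed model: the phase
# detector output is proportional to phase error, the control voltage is
# kp * error, and the VCO shifts its frequency by kv Hz per volt.

def steps_to_lock(kv_hz_per_volt, kp=0.5, dt=1e-9, err0=1.0, tol=0.01):
    """Simulation steps until a 1 rad phase offset decays below tol."""
    err = err0
    for n in range(1, 100000):
        # Each step the VCO is steered against the error, so the error
        # shrinks by 2*pi*kv*kp*err*dt radians.
        err -= 2 * math.pi * kv_hz_per_volt * kp * err * dt
        if abs(err) < tol:
            return n
    return None  # loop gain too low to lock within the simulated window

for kv in (1e6, 1.1e6):  # a 10% shift in VCO gain
    print(f"Kv = {kv:.2e} Hz/V -> lock in {steps_to_lock(kv)} steps")
```

A 10% change in Kv moves the lock time noticeably; in a real loop that same kind of shift shows up as different bandwidth, different jitter filtering, and a different phase relationship to the downstream domains.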
This stuff is just the tip of the iceberg; anything deeper and one really needs to be an EE to get a proper grasp.