Sorry, but can you explain this boot-time calibration?
Since Carrizo, at boot time the CPU measures the TRUE Vcore actually delivered by the VRMs, both with the CPU idle and under a fixed load. From these measurements the CPU calculates a DC offset and an AC coefficient that it later uses to compensate for the difference from the theoretically required Vcore. E.g.: the CPU orders the VRMs to deliver 1 V and measures the TRUE Vcore they really provide, both with the CPU idle and under a fictitious load (maybe one or more cores executing predefined power-hungry code). With this, during normal operation, whenever the CPU decides it needs X volts, it calculates the actual VID it must send to the VRMs to get exactly X volts. So you can lower the Vcore margins you usually have, and avoid malfunctions in case the VRM shift exceeds the tolerance.
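To make the idea concrete, here is a minimal sketch of one plausible reading of that calibration math. This is purely illustrative: the function names, the single 1 V calibration point, and the simulated VRM error numbers are my assumptions, not AMD's actual firmware behavior.

```python
def calibrate(measure):
    """Fit a simple error model from two boot-time measurements.

    `measure(requested_v, loaded)` stands in for the CPU's on-die voltage
    sensor (hypothetical interface). The DC offset is the error seen with
    the CPU idle; the AC coefficient is the extra droop under the fixed
    calibration load.
    """
    req = 1.0                                  # single calibration point, in volts
    idle_v = measure(req, loaded=False)        # TRUE Vcore with CPU idle
    load_v = measure(req, loaded=True)         # TRUE Vcore under the fixed load
    dc_offset = req - idle_v                   # constant error of the VRM
    ac_coeff = idle_v - load_v                 # additional droop when loaded
    return dc_offset, ac_coeff

def vid_for_target(target_v, dc_offset, ac_coeff, loaded):
    """Compensated request: ask for more than the target so the delivered
    voltage lands exactly on target_v."""
    return target_v + dc_offset + (ac_coeff if loaded else 0.0)

# Simulated VRM: always 15 mV low, plus 30 mV of droop under load
# (numbers invented for the example).
def vrm(requested_v, loaded):
    return requested_v - 0.015 - (0.030 if loaded else 0.0)

dc, ac = calibrate(vrm)
req = vid_for_target(1.0, dc, ac, loaded=True)
delivered = vrm(req, loaded=True)
assert abs(delivered - 1.0) < 1e-9   # compensated request hits exactly 1.0 V
```

The point of the sketch is the inversion step: once the CPU knows the offset and the droop, it over-requests by exactly that amount, so the margin normally added "just in case" can be shrunk.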
EDIT: this also allows looser tolerance margins on the VRMs, meaning cheaper motherboards: if a higher VRM tolerance is acceptable, lower-quality VRMs can be used...
This calibration also compensates for VRM, PSU, CPU and motherboard ageing, which causes VRM voltage shifts over the years...
BTW, Fiji also has boot-time calibration.