None of that speaks to device reliability.
You keep talking about device parametrics, which is not the same thing as reliability.
I suppose you would like to know the limits
of the process, that is, the frequency/voltage function
as well as the upper limit of the supply voltage, which would
yield the maximum frequency limit?
From the graphs, we can safely assume that it will
work reliably at 0.7/0.8 V for frequencies up to 1/1.4 GHz.
The max supply voltage will be inherently lower than on SB; likely
the 1.3 V mentioned above will be the absolute Vds max.
As leakage increases dramatically with each node shrink, it makes sense
for Intel to focus on lower voltage and better efficiency rather
than on frequency, which cannot be increased other than
by raising the supply voltage.
As I mentioned above, the FETs' speed depends not only
on the parasitic capacitances but also on Gm, that is,
the conductance of the device.
(Conductance is the reciprocal of resistance; its value
is 1/R and its unit is the siemens.)
Basically, the time to charge a capacitor C is
T = R x C = C/Gm.
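To put numbers on that relation, here is a minimal Python sketch (the values are normalized placeholders I made up, not actual process figures):

    # RC-delay sketch: T = R x C = C / Gm, and the max switching frequency goes as 1/T.
    # All values are normalized placeholders, not real process numbers.
    def fmax(gm, c):
        t = c / gm        # time to charge the load capacitance
        return 1.0 / t    # achievable frequency scales as 1/T

    baseline = fmax(gm=1.0, c=1.0)
    lower_gm = fmax(gm=0.7, c=1.0)       # 30% lower conductance, same caps
    print(lower_gm / baseline)           # -> 0.7, i.e. a 30% lower max frequency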
As such, with a 30% reduction in conductance, if it were
the same node, the max frequency would be reduced by 30%;
but since the node shrink theoretically allows a halving
of the parasitic caps, the device should come out around 20% faster
despite its reduced Gm. Probably a little less, since
a halving of the interconnect (track) capacitances can't be attained;
still, a figure of around 10% is possible, which correlates well
with Intel's picture of a 7% faster device at normal Vds.
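As a rough sketch of how the two effects combine (the capacitance scale factor below is purely an assumption for illustration, since the real interconnect scaling isn't public):

    # Net frequency scaling when both Gm and the parasitic caps change.
    # Since f ~ Gm / C, the new-to-old max frequency ratio is simply:
    def freq_ratio(gm_scale, c_scale):
        return gm_scale / c_scale

    # Assumed figures for illustration only: Gm down 30%, caps down ~35%
    # (well short of a full halving, because interconnect caps barely shrink).
    print(freq_ratio(gm_scale=0.7, c_scale=0.65))   # ~1.08, i.e. roughly 8% faster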
But I can only insist on the fact that the better performance
of the device lies first in its ability to work at low voltage,
while performance at the other end, i.e. high frequency and
HIGHER voltage, will not bring miracles, even if that seems to be the most desired feature among the geeks who hang around there.