Why should GF100 be compared to Hemlock if the performance disparity is expected to be so large in a key metric of relevance?
Is GF100 expected to cost $600 like Hemlock does?
I understand nothing prevents one from making the comparison, I just don't get the prescriptive "should" in that sentence.
One could compare a really large number to a really small number any time of the day, but why should one do it?
I think the interest here is to guess the shader core speed.
Additionally we have last gen shader count and speed.
Sure, Fermi is a new architecture and blah di blah, so we will have to wait and see how it does in games.
But if last gen was 800 vs 256 and now it is 1600 vs 512 with similar clocks, that could indicate (all guesswork that may be completely wrong) that Fermi isn't going to outperform the 5870 by that much.
Then we have transistor counts and estimated die sizes. Those seem to indicate that Cypress is cheaper to make than Fermi.
Yep, loads of guesswork that in the end can amount to nothing.
On the other hand if Fermi shader clock was 3GHz, assuming Fermi would spank Cypress in current games wouldn't be farfetched.
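The napkin math above can be sketched out like this (the clock values below are my own illustrative assumptions, not confirmed specs, and the whole thing ignores architecture and per-shader efficiency):

```python
# Crude shader throughput proxy: unit count times clock.
# Pure guesswork -- says nothing about IPC, architecture, or real
# game performance; all clocks are assumptions for illustration.

def shader_throughput(units, clock_ghz):
    """Relative throughput guess: shader count times clock in GHz."""
    return units * clock_ghz

cypress    = shader_throughput(1600, 0.85)  # assumed 5870-class clock
fermi_low  = shader_throughput(512, 1.5)    # the "similar clocks" case
fermi_high = shader_throughput(512, 3.0)    # the hypothetical 3 GHz case

print(cypress, fermi_low, fermi_high)  # 1360.0 768.0 1536.0
```

With similar clocks Fermi's 512 shaders come out well behind 1600, and only at something like 3 GHz does the proxy flip in Fermi's favor, which is the whole point of the speculation.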
Guess we can compare this to the guesswork about Cypress performance a few months ago, when we started seeing specs that pretty much doubled the 4870.