I've been waiting for the GFFX for a very long time with anticipation. But something is wrong here.
Ridiculous heat, that cooling system, the amazing(?) clock speed, the whopping(?) performance,
the release delay, and then these immature drivers after such a long preparation time.
The sum of all of these is self-contradictory. What's happening here?
Here's my 2 cents...
While nVIDIA was babbling about the 130nm process all the time, I feel there's something else behind it.
I mean... they were aggrandizing themselves with arrogant contentment, underestimating ATi.
That led to them getting carried away for no good reason, implementing that immense DX9 over-spec
and some confusing new technologies like dynamic pipelines.
But in doing so, they wasted their precious transistors on nothing and gave up too much raw performance.
Even worse, the 9700 Pro turned out better than expected and came close to its paper specs.
So they couldn't beat the 9700 Pro with the prototype GFFX.
However, they couldn't just scrap the FX and redesign it from scratch at that point, especially
with the 9700 Pro already out. So the only option left was to overclock it.
That's fine. But it generated too much heat to be overclocked far enough to outperform the 9700 Pro.
So they needed some heat-reducing technologies. Didn't they mention such things?
But they failed to bring those techs to fruition.
Judging by the current heat output, I see little or none of that on the GFFX.
Anyway, they failed, and instead inevitably chose this grand eye-candy cooling system. But even with that,
they couldn't handle the heat problem well enough to get things right on 130nm wafers. What's the temp? 150°F? Or 70°C?
Wasn't it a bit weird that while TSMC handled other 130nm parts just fine, they couldn't get the FX right for what, about 6 months?
Putting all this together, I can see why its drivers are so immature compared to their previous ones.
Simply put, they had no real GFFX board in their hands, 130nm process or not, and therefore they
didn't have enough time to mature the drivers.
If I were the CEO of nVIDIA, I would have sent some kind of in-house DX9 benchmark along with the packages sent to review sites,
since that's the only area left where the GFFX can flex its muscles.
But they didn't. Why?
Pressed for time, maybe they missed the window to release one, or maybe its results were no better than the current ones.
I don't know which is true, but the latter seems more likely to me.
Am I assuming too much? Maybe. Maybe it's all down to 130nm, and I hope so, because I've waited for this damn thing too long.
But something deep inside tells me that's not the case.