My first post here... yes, it is.
I have this X1900 All-In-Wonder retail box sitting next to me, card still inside, glistening, looking sexy. You naughty thing, you *growls and claws at it*
However, I am a real silent-PC enthusiast, and this card isn't a typical purchase for one. Thermally, a 7800GT or 7900GT is optimal for a silent setup (less heat than a 9800 XT!). Yet Nvidia's multimedia options and goodies are eternally pathetic, so I never buy them. This would be my fourth All-In-Wonder card to date, so I kind of need one for my style.
Anyway, it's 90nm. The 80nm version is "soon out", as in, not really soon, but around late fall. I COULD wait for it. But here is my question:
does a shift from 90nm to 80nm bring:
a. same MHz and noticeably cooler operation
b. same MHz and less than 10% lower wattage, i.e., not really noticeable
c. more MHz for the AIW on account of more headroom with the single-slot cooler
d. probably won't come out on the AIW anytime soon, as this card is apparently difficult to design as it is
e. lots of marketecture and more profit from cheaper chips, but no difference at all for anything
I saw the thread "80nm, anything besides die shrink?" and someone said GDDR4, which, honestly, I doubt they would toss in for just a minor shift. Of course, the higher up the GDDR number, the better performance per watt you get, but I want the SAME performance for less wattage, something that doesn't sell products well in the video card industry, as this is a kiddie "OMGOSH I HAVE l33t OC CARD NOW" world.
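To put a rough number on option (b): here is a crude first-order sketch of dynamic power scaling, assuming switched capacitance scales linearly with feature size and that voltage and clock stay unchanged. Real shrinks rarely behave this cleanly (leakage, voltage tweaks, and layout changes all move the number), so treat this as an illustration, not a prediction:

```python
# Back-of-envelope dynamic power estimate: P ~ C * V^2 * f.
# Assumption: switched capacitance C scales roughly linearly with
# feature size; static/leakage power is ignored entirely.

def dynamic_power_ratio(old_nm, new_nm, v_ratio=1.0, f_ratio=1.0):
    """Ratio of new dynamic power to old for the same design.

    v_ratio / f_ratio: new-to-old voltage and clock ratios (default:
    unchanged, i.e. a pure optical shrink).
    """
    c_ratio = new_nm / old_nm          # capacitance scales with geometry
    return c_ratio * v_ratio ** 2 * f_ratio

ratio = dynamic_power_ratio(90, 80)
print(f"{(1 - ratio) * 100:.0f}% lower dynamic power at same clock/voltage")
```

Under these assumptions a 90nm to 80nm shrink lands around an 11% dynamic-power reduction at the same clock, which is right at the "less than 10-ish percent, not really noticeable" territory of option (b).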
Well, should it stay, or should I wait and get my $449 plus tax back?
hi.