Originally posted by: OpStar
Fiasco my ass. It was a delayed launch of a card that required loud, extensive cooling because of the DDR II. Need we remember that this whole "fiasco" was because of TSMC and not nVidia. They are the reigning market holder in the GPU market, and they decided to be forward thinking with their next GPU, a la the nv30, because they are aware that the market is flooded with cards that are just faster and faster at the same old thing, but really offer no innovation for the money.
So it's not nVidia's fault for trying to move to 0.13 micron too soon, then?
Faster and faster at the same old thing? ATi's 9x00 range mostly offered much better performance at AA and AF, something which hadn't really been possible with previous cards, so they became fast in a new area, and faster at the same old thing.
The 9800 also loops shaders, so that kinda nullifies part of the FX's shader engine.
Extensive cooling because of DDR-2? Not because of the core speed? Most graphics card heat comes from the GPU; I think it was the GPU, not the DDR-2, that necessitated the stupid cooling.
The FX Ultra is not without flaws, pretty much like any hardware part. What it is though, is the testbed for the progression to .13 micron (which even ATi benefitted from) and it will allow nVidia to scale flawlessly on the next few cores that are in the .13 process. I for one congratulate them on forward thinking.
I congratulate ATi on being able to push the 0.15 micron process as far as they have.
It's not really forward thinking, it's natural progression. Sure, they went there first, but ATi was close behind; it was a case of when, really, and their choice to move to 0.13 micron lost them ground, which isn't great.
All the rest of their releases will benefit from the "fiasco" that was the FX Ultra (a fast card, that is dx 9 compliant, and able to keep up with ATi's top of the line offerings).
Shame on nVidia. They shoulda just made a 256-bit version of the 4600, with more ram, and a higher core speed, on the same .15 process, and charged 400 for it, for a whole other product cycle, then went to the nv30. This way, they woulda got all your money, and launched the .13 part on time, and all you naysayers would have nothing to complain about.
ATi would have held the performance crown, there's no way that nVidia could have competed in the AA/AF department without some fairly big changes to the core IMO.
Sure, ATi owns the speed crown. On an old .15 process, with no innovation, and far from the greatest drivers.
Stop using the driver argument. nVidia drivers have many problems as well. It can quite often be the user's fault too, and ATi certainly seem to come out with new drivers fairly often and try to fix the problems. Their drivers aren't that bad; even Carmack said so.
And surely holding the crown on an old process is quite an achievement: their GPU is clocked slower, and yet the card still performs as well as the 5800.
And no innovation? Meaning....?
I don't really see innovation that's really different, and if it is there, where's the use for it?
I for one am glad that nVidia is trying to change the way GPUs are made. I'm sure with the nv35, you all will be as well.
Changing the way GPUs are made? How? Have they stopped using transistors? Have they embedded RAM?
Basically, they have done things that ATi just delayed doing slightly. Sure, they hit 0.13 micron first, but ATi wasn't far behind and it was the next obvious change.
Other "changes" are just the same as would have happened anyway, it's not so much of a change, it's just a progression.
AMD's Hammer is changing the way processors are made; Intel's Prescott is not.
I still think calling the nv30 a fiasco is a bit overstated, don't you? It does everything it is supposed to do. It was just too forward-thinking for its time.
Yeah, cos it came out six months after the 9700 and failed to really outperform it, and it was much hotter and louder. So it was, in pretty much all areas, a worse product; plus the 9700 was cheaper if you looked in the right places. The continual delays alone make it a fiasco.
It amazes me that consumers won't pay 400 for a .13 card with DDR II because of the cooling solution, but they will pay 400 for a 9700 Pro with ramped up core/mem speed, and some minor core revisions.
Some of the shader pipelines in the FX may be innovative, and being first to use DDR-2 is a first, but it's hardly innovation.
And DDR-2 on a 128-bit memory bus is a little silly, since going with a 256-bit bus would have been more sensible.
So that adds to the flaws of the FX.
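To put some rough numbers on the bus-width point: raw memory bandwidth is just bus width times effective clock. The clock figures below are the approximate launch specs as I remember them, so treat this as a sketch rather than gospel:

```python
def raw_bandwidth_gb_s(bus_bits, effective_mhz):
    # bytes per transfer * transfers per second, expressed in GB/s (10^9 bytes)
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

# FX 5800 Ultra: 128-bit bus, DDR-2 at 500 MHz (1 GHz effective)
fx_5800 = raw_bandwidth_gb_s(128, 1000)
# Radeon 9700 Pro: 256-bit bus, DDR at 310 MHz (620 MHz effective)
r9700 = raw_bandwidth_gb_s(256, 620)
print(fx_5800, r9700)  # 16.0 19.84
```

So even with exotic 1 GHz DDR-2, the narrow bus leaves the FX with less raw bandwidth than the 9700 Pro's plain DDR on a 256-bit bus.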
"NVIDIA makes up their performance advantages in their memory architecture, higher core clock speeds and overall efficiency"
Their memory architecture is nothing really special; ATi also has compression techniques (Hyper-Z III), which would probably offer more than the 48GB/s bandwidth nVidia claims they can peak at.
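For what it's worth, that 48GB/s figure only falls out if you multiply the card's raw bandwidth by an assumed sustained compression ratio. The 3:1 ratio below is simply what makes the marketing number work, not a measured figure:

```python
def effective_bandwidth_gb_s(raw_gb_s, compression_ratio):
    # "Effective" peak = raw bandwidth scaled by an assumed lossless
    # compression ratio; only reachable if every access compresses that well.
    return raw_gb_s * compression_ratio

raw = 16.0  # FX 5800 Ultra raw bandwidth: 128-bit bus at 1 GHz effective
print(effective_bandwidth_gb_s(raw, 3.0))  # 48.0
```

Any such "effective" number is a best case, which is why comparing it against a competitor's raw figure is misleading.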
Efficiency is not really true either: something running at a slower frequency but equalling performance is surely more efficient, a la AMD vs Intel.
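The AMD/Intel analogy can be made concrete with a trivial per-clock comparison. The frame rate here is hypothetical (a benchmark where the two cards tie), and the core clocks are the approximate launch specs:

```python
def fps_per_mhz(fps, core_mhz):
    # Crude per-clock "efficiency": frames per second delivered per MHz of core clock.
    return fps / core_mhz

fps = 100.0  # hypothetical benchmark result where both cards tie
r9700_per_clock = fps_per_mhz(fps, 325.0)   # 9700 Pro core: ~325 MHz
fx5800_per_clock = fps_per_mhz(fps, 500.0)  # 5800 Ultra core: ~500 MHz
print(round(r9700_per_clock, 3), fx5800_per_clock)  # 0.308 0.2
```

On those numbers the 9700 Pro does roughly half again as much work per clock, which is the sense in which the slower-clocked part is the more efficient one.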
"The end result of this compression engine is that anti-aliasing now becomes a very low cost operation, since very little memory bandwidth is wasted"
" Because of the compression engine, performance with AA enabled should be excellent on the GeForce FX."
"NVIDIA claims that their anisotropic filtering algorithm is more precise than ATI's, so the GeForce FX's anisotropic filtering should look just as good if not better than the Radeon 9700 Pro's."
"NVIDIA's FX Flow technology supports a wide range of speed levels to run the fan at; at its loudest the fan is no louder than a noisy Ti 4600."
A lot of the comments about the FX are not really all that accurate.
"The compression engine is completely invisible to the rest of the architecture and the software running on the GeForce FX, which is key to its success. It is this technology that truly sets the GeForce FX apart from the Radeon 9700 Pro."
That could be innovation, but it's not good innovation.
Was gonna add more, but can't be bothered for now, I'll just wait to be countered.