GeForce FX 5900 Ultra vs Radeon 9800 Pro *benchmarks*

sellmen

Senior member
May 4, 2003
459
0
0
I have no idea what the article says, but here is the important part:


3DMark 2003 Scores:

ATi RADEON 9800Pro Cat 3.2: 5,496

FX5900Ultra DetonatorFX: 6,678
FX5900Ultra Det43.51: 5,981

FX5800Ultra Det43.51: 5,429
 

Uclagamer_99

Platinum Member
Jul 28, 2000
2,867
1
76
I'd still like to see the AA and AF benchmarks... that's where ATI really kicked Nvidia's butt the last round
 

Rogozhin

Senior member
Mar 25, 2003
483
0
0
I believe that you still have to use the Application setting to force trilinear AF, NOT Quality.

nVidia's driver properties are so freaking misleading!

I've always hated their driver control panel.

rogo
 

Jeriko

Senior member
Apr 3, 2001
373
0
0
Just what I needed. Another card to make my decision even more gut wrenching.

I was all ready to order a 9800 Pro to go with all the other parts of my new system on the way, yet here I see this right around the corner.

If this is on the street in two weeks for $400, and I'd just bought a 9800 Pro for the same price, I'd probably gnaw through my wrist veins.

-J
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
Yeah, I'd definitely say June. But I do think it will beat the 9800 Pro across the board in non-AA/AF and will probably be very close with AA/AF. I think I'd still have to opt for a Hercules 9800 Pro though; it is just too cool.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I believe that you still have to use the Application setting to force trilinear AF, NOT Quality.

Quality is now what Application used to be. Application has been removed as an option, Balanced is the middle setting (which is what Quality was in prior drivers), and Performance remains Performance.
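
To visualize the renaming, here is a minimal sketch of the mapping described above; the dictionary is purely illustrative and not any actual driver interface:

# Hypothetical sketch of the Detonator FX filtering preset renaming
# described above; these are just labels, not a real driver API.
old_to_new = {
    "Application": "Quality",      # old Application behavior, renamed
    "Quality": "Balanced",         # old Quality is now the middle setting
    "Performance": "Performance",  # unchanged
}
for old, new in old_to_new.items():
    print(f"{old} (Det 43.51) -> {new} (Detonator FX)")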
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
I would LOVE to see a "full" release of this card by next month. By that, I mean several manufacturers, several vendors, MSRP at Best Buy.
The FX Ultra "launch" has left me a bit disappointed, mainly because I don't own one to sell as a collectible...
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
The sooner the 5900 is out, the better. nVidia needs to get the 5800U fiasco behind them.
 

OpStar

Member
Apr 26, 2003
75
0
0
Fiasco my ass. It was a delayed launch of a card that required loud, extensive cooling because of the DDR II. Need we remember that this whole "fiasco" was because of TSMC and not nVidia. They are the reigning leader in the GPU market, and they decided to be forward thinking with their next GPU, a la the NV30, because they are aware that the market is flooded with cards that are just faster and faster at the same old thing, but really offer no innovation for the money.

The FX Ultra is not without flaws, pretty much like any hardware part. What it is, though, is the testbed for the progression to .13 micron (which even ATi benefited from), and it will allow nVidia to scale flawlessly on the next few cores that are in the .13 process. I for one congratulate them on forward thinking.

All the rest of their releases will benefit from the "fiasco" that was the FX Ultra (a fast card that is DX9 compliant and able to keep up with ATi's top-of-the-line offerings).

Shame on nVidia. They shoulda just made a 256-bit version of the 4600, with more RAM and a higher core speed, on the same .15 process, and charged $400 for it for a whole other product cycle, then gone to the NV30. This way, they woulda got all your money and launched the .13 part on time, and all you naysayers would have nothing to complain about.

Sure, ATi owns the speed crown. On an old .15 process, with no innovation, and far from the greatest drivers.

I for one am glad that nVidia is trying to change the way GPUs are made. I'm sure with the NV35, you all will be as well.

I still think calling the NV30 a fiasco is a bit overstated, don't you? It does everything it is supposed to do. It was just too forward thinking for its time.

It amazes me that consumers won't pay $400 for a .13 card with DDR II because of the cooling solution, but they will pay $400 for a 9700 Pro with ramped-up core/mem speeds and some minor core revisions.
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
The card is a flop. Look at any review. I had a GF2/GF3/GF4 and liked each of them (I still have a GF3 in one PC). When each of those cards was released, it was met with praise from the industry and from owners. They were cutting-edge products that were at least a generation ahead of any competitor's offerings. They sold like hotcakes. Everyone had to have one. The FX is just the opposite: poor reviews from the industry, way late, performance that barely keeps up with the competitor's 6-month-old product. Even to do that, the clock speed had to be pushed so high that a ridiculous cooling system was needed. People are not waiting in line to buy them.

The NV30 is the only nVidia card I can recall ever having an overwhelmingly negative response, and for good reason. That makes it a fiasco for nVidia.

I hope the NV35 is a success. I think nVidia learned from the mistakes of the NV30.
 

OpStar

Member
Apr 26, 2003
75
0
0
People aren't standing in line to buy them, cuz you can't find them.

Find me some Ultra OWNERS who don't like them.

This has been gone over and over at numerous other forums.

Yeah, it's a horrible card. That is why it's 50% off the sale price at every online vendor, correct?
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Find me some Ultra OWNERS who don't like them.
I know of a few former owners. There are several AT members who sold or returned their cards shortly after buying them because they didn't like them, noise being the primary reason.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
But heck, the noisy cooling is only part of the problem. I mean, if it actually had revolutionary performance, where you could crank up the eye candy beyond everything else on the market and still get better frame rates, then most people in line for a top-shelf card would let the loud fan slide or go watercooling.
 

Deeko

Lifer
Jun 16, 2000
30,213
12
81
Originally posted by: sellmen
I have no idea what the article says, but here is the important part:


3DMark 2003 Scores:

ATi RADEON 9800Pro Cat 3.2: 5,496

FX5900Ultra DetonatorFX: 6,678
FX5900Ultra Det43.51: 5,981

FX5800Ultra Det43.51: 5,429

Did you just say important and 3DMark in the same sentence?
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: OpStar
Fiasco my ass. It was a delayed launch of a card that required loud, extensive cooling because of the DDR II. Need we remember that this whole "fiasco" was because of TSMC and not nVidia. They are the reigning leader in the GPU market, and they decided to be forward thinking with their next GPU, a la the NV30, because they are aware that the market is flooded with cards that are just faster and faster at the same old thing, but really offer no innovation for the money.
It's not nVidia's fault for trying to move to 0.13 micron too soon, then?
Faster and faster at the same old thing? ATi's 9x00 range mostly offered much better performance with AA and AF, something which hadn't really been possible with previous cards, so they became fast in a new area, and faster at the same old thing.
The 9800 also loops shaders, so that kinda nullifies part of the FX's shader engine.
Extensive cooling because of DDR-2? Not because of the core speed? Most graphics card heat comes from the GPU; I think it was the GPU, not the DDR-2, that necessitated the stupid cooling.

The FX Ultra is not without flaws, pretty much like any hardware part. What it is, though, is the testbed for the progression to .13 micron (which even ATi benefited from), and it will allow nVidia to scale flawlessly on the next few cores that are in the .13 process. I for one congratulate them on forward thinking.
I congratulate ATi on being able to push the 0.15 micron process as far as they have.
It's not really forward thinking, it's natural progression. Sure, they went there first, but ATi were close behind; it was a case of when, really, and their choice to move to 0.13 micron lost them ground, which isn't really great.

All the rest of their releases will benefit from the "fiasco" that was the FX Ultra (a fast card that is DX9 compliant and able to keep up with ATi's top-of-the-line offerings).

Shame on nVidia. They shoulda just made a 256-bit version of the 4600, with more RAM and a higher core speed, on the same .15 process, and charged $400 for it for a whole other product cycle, then gone to the NV30. This way, they woulda got all your money and launched the .13 part on time, and all you naysayers would have nothing to complain about.
ATi would have held the performance crown; there's no way that nVidia could have competed in the AA/AF department without some fairly big changes to the core, IMO.

Sure, ATi owns the speed crown. On an old .15 process, with no innovation, and far from the greatest drivers.
Stop using the driver argument. nVidia drivers have many problems as well. It can quite often be the user's fault too, and ATi certainly seem to come out with new drivers fairly often and try to fix the problems, and their drivers aren't that bad; even Carmack said so.
And surely holding the crown on an old process is quite an achievement; their GPU is clocked slower, and yet the card still performs as well as the 5800.
And no innovation? Meaning....?
I don't really see innovation that's really different, and if it is there, where's the use for it?

I for one am glad that nVidia is trying to change the way GPUs are made. I'm sure with the NV35, you all will be as well.

Changing the way GPUs are made? How? Have they stopped using transistors? Have they embedded RAM?
Basically, they have done things that ATi just delayed doing slightly. Sure, they hit 0.13 micron first, but ATi wasn't far behind, and it was the next obvious change.
Other "changes" are just the same as would have happened anyway; it's not so much a change, it's just a progression.
AMD's Hammer is changing the way processors are made; Intel's Prescott is not.

I still think calling the NV30 a fiasco is a bit overstated, don't you? It does everything it is supposed to do. It was just too forward thinking for its time.

Yeah, cos it came out 6 months after the 9700 and failed to really outperform it, and it was much hotter and louder, so it was, in pretty much all areas, a worse product. Plus the 9700 was pretty much cheaper if you looked in the right places, so it was a bit of a fiasco, especially given that it was continually delayed. The delays make it a fiasco.

It amazes me that consumers won't pay $400 for a .13 card with DDR II because of the cooling solution, but they will pay $400 for a 9700 Pro with ramped-up core/mem speeds and some minor core revisions.

Some of the shader pipelines in the FX may be innovative, and using DDR2 may be a first, but being first isn't innovation.
And DDR2 with a 128-bit memory bus is a little silly in some ways, since going with a 256-bit bus would have been more sensible (see the bandwidth sketch below).
So that adds to the flaws of the FX.

"NVIDIA makes up their performance advantages in their memory architecture, higher core clock speeds and overall efficiency"
Their memory architecture is nothing really special; ATi also has compression techniques (Hyper-Z III) which would probably offer more than the 48GB/s bandwidth nVidia claims they can peak at.
Efficiency is not really true either: something running at a slower frequency but equalling performance is surely more efficient, a la AMD/Intel.
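
For reference, here is a minimal sketch of the raw (uncompressed) bandwidth math behind the bus-width point, assuming the commonly cited memory clocks (500 MHz DDR-II on the 5800 Ultra, 310 MHz DDR on the 9700 Pro, 425 MHz DDR on the 5900 Ultra); the figures are illustrative, not official specs:

# Rough, illustrative raw memory bandwidth math; no compression counted.
# Clock figures are the commonly cited ones and are assumptions here.
def raw_bandwidth_gb_s(bus_bits, mem_clock_mhz, pumps=2):
    # bus width in bytes * effective transfers per second (DDR = 2 per clock)
    return (bus_bits / 8) * (mem_clock_mhz * 1e6 * pumps) / 1e9

cards = {
    "FX 5800 Ultra (128-bit DDR-II @ 500 MHz)": (128, 500),
    "Radeon 9700 Pro (256-bit DDR @ 310 MHz)": (256, 310),
    "FX 5900 Ultra (256-bit DDR @ 425 MHz)": (256, 425),
}
for name, (bits, mhz) in cards.items():
    print(f"{name}: {raw_bandwidth_gb_s(bits, mhz):.1f} GB/s")

# Prints roughly 16.0, 19.8 and 27.2 GB/s respectively, which suggests the
# 48GB/s figure is an effective, post-compression claim rather than raw throughput.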

"The end result of this compression engine is that anti-aliasing now becomes a very low cost operation, since very little memory bandwidth is wasted"
" Because of the compression engine, performance with AA enabled should be excellent on the GeForce FX."
"NVIDIA claims that their anisotropic filtering algorithm is more precise than ATI's, so the GeForce FX's anisotropic filtering should look just as good if not better than the Radeon 9700 Pro's."
"NVIDIA's FX Flow technology supports a wide range of speed levels to run the fan at; at its loudest the fan is no louder than a noisy Ti 4600."
A lot of the comments about the FX are not really all that accurate.
"The compression engine is completely invisible to the rest of the architecture and the software running on the GeForce FX, which is key to its success. It is this technology that truly sets the GeForce FX apart from the Radeon 9700 Pro."
That could be innovation, but it's not good innovation.

Was gonna add more, but can't be bothered for now, I'll just wait to be countered.
 

OpStar

Member
Apr 26, 2003
75
0
0
You are wrong though. The DDR II made it run hot, not the GPU. They are running the same GPU with a normal cooling solution in the NV35.
 

keitaro

Member
Jan 30, 2003
151
0
0
Originally posted by: OpStar
You are wrong though. The DDR II made it run hot, not the GPU. They are running the same GPU with a normal cooling solution in the NV35.

I disagree with you there. Memory is something that doesn't generate a lot of heat. Yes, it'd be necessary to put heatsinks on RAM chips today, but consider the fact that the RAM chips themselves require only passive cooling. They don't generate enough heat to require a giant fan to cool everything across the board. Look back at when Pentiums needed active cooling solutions while the RAM chips needed nothing. That's because memory doesn't generate a lot of heat compared to a processor like a Pentium or GeForce FX.

So while memory technology changes and evolves, the amount of cooling it needs, where necessary at all, is still much less than what is needed to cool a hot chip like the FX. We're not up to the point where we'd need active cooling across the board. Not yet, anyway...
 

bjc112

Lifer
Dec 23, 2000
11,460
0
76
Originally posted by: Jeriko
Just what I needed. Another card to make my decision even more gut wrenching.

I was all ready to order a 9800 Pro to go with all the other parts of my new system on the way, yet here I see this right around the corner.

If this is on the street in two weeks for $400, and I'd just bought a 9800 Pro for the same price, I'd probably gnaw through my wrist veins.

-J

Grab a 9500 Pro for $175 and sit on it for some time... It should hold you over until ATi's next offering... or maybe even Nvidia's.

 

Rogozhin

Senior member
Mar 25, 2003
483
0
0
If you want to play Doom 3 at 1280, 8x AF, 6x AA, and with all details on highest, then your best bet is to NOT buy a 9800 Pro or NV35 and wait for the NV40 and R450.

rogo
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
I thought I remembered seeing 38 fps at 1280x1024 with 4x FSAA and 8x Quality AF for the 5900 Ultra in Doom III. And considering it isn't a run-and-gun frag fest like the first Dooms and Quakes (it's more of a first-person Resident Evil), 38 fps could very well be playable, hehehe.