Geforce FX 5900 Ultra VS Radeon 9800 Pro 256 mb

Page 4 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Rogozhin

Senior member
Mar 25, 2003
I sure won't be picking up either of those cards for at least 3 months. Why would you if you own a 9700 pro or nv30 and you aren't running a 3.06 ghz p4.

there is no need for most since we are cpu limited anyways.

rogo

chizow

Diamond Member
Jun 26, 2001
Originally posted by: Rogozhin

I have no problems with splinter cell on my 9700, shadows run at highest and they work, check out this url for the 9800 pro pulling ahead of the 5900 in THAT game: http://www.hardocp.com/article.html?art=NDcyLDEw
What's wrong with this picture? The problem is even worse if you enable AA, as backlighting will shine through foreground textures improperly. It's also well-documented that the R3XX doesn't support all of the dynamic lighting effects in SC. If this is an issue for the R3XX-based cards now, I can't see the situation improving in the future (DooM3, HL2, etc.).

Anand, hard, digextreme, b3d all state that ati's drivers are ROCK solid, so it's hard to take what you say as viable when the major sites and myself are concluding that ati's drivers are very strong, and therefore compatibility too.
A review is a snapshot at any given time. Sure, their drivers might be fine as of today, but how about when the particular game was released? I don't consider reactive driver fixes to be an indication of solid drivers, I call it a band-aid. The number of game compatibility issues with the R3XX that are either ongoing or fixed in subsequent driver revisions is well-documented at Rage3D. On the other hand, I've seen more and more games recommending nVidia-based GPUs along with the "Way It's Meant to Be Played" logo. SC and DooM3 look to further emphasize this going forward. I'm currently testing/playing PlanetSide, which is set for release next week and suffers from hard locks on R3XX-based cards. There's still no fix for the problem, which seems to be caused by particle effects and dynamic lighting...hmmm. The developers and the game specifications recommend nVidia-based GPUs and it too will feature the "Way It's Meant to Be Played" logo. I'm sure I could wait 2 or 3 months for a fix by ATi, but I've spent too much time waiting in frustration as a result of ATi cards...

Chiz

Oh yah, make sure you pick up your WHQL FX drivers. Told you FP24 wasn't an issue. :p

Blastman

Golden Member
Oct 21, 1999
Originally posted by: chizow

The R3XX has numerous issues with Splinter Cell (lighting, shadows, AA, etc.) which is arguably the best showcase...
Nvidia's been having their share of "rendering problems" these days too. Even with the latest drivers the 5200 is not rendering the water properly...

tech report



Rogozhin

Senior member
Mar 25, 2003
I believe that splinter cell's shadows and fuzz around the lights aren't being rendered on the nvidia drivers (check hardocp).

"SC and DooM3 look to further emphasize this going forward. I'm currently testing/playing PlanetSide, which is set for release next week and suffers from hard locks on R3XX-based cards. There's still no fix for the problem, which seems to be caused by particle effects and dynamic lighting...hmmm. The developers and the game specifications recommend nVidia-based GPUs and it too will feature the "Way It's Meant to Be Played" logo. I'm sure I could wait 2 or 3 months for a fix by ATi, but I've spent too much time waiting in frustration as a result of ATi cards..."

This is all unproven.

The SC AA bug happens on nvidia hardware too, go check tech forums at splintercell.com.

And what do you think you're going to find at rage3d.com chitz? It's an ATi website, so the bugs listed there are only occurring for those users that own ATi hardware. Check nvnews (which is about 85% smaller than rage), or just go to the games' tech sites to find out that most of the bugs occur across the board.

The planetside question is still unfounded until it releases, as is doom3. Remember nvidia and futuremark 2003?
Without optimized drivers they scored very poorly, same situation with the BETA doom3.

Cat 3.4s are out tomorrow and the det WHQL 43s are out today - about time, since it's been 6 months since an nvidia WHQL release.

Rogo




Rogozhin

Senior member
Mar 25, 2003
also

along the same lines as chitz's logic.

I have seen a few games with ATi's "meant to be played" logo, Raven Shield and HL2; I guess their drivers are the best around.

c'mon chiz, I know you're an intelligent fella ;)

rogo

Pete

Diamond Member
Oct 10, 1999
Originally posted by: touchmyichi
Has anyone here heard of how the NV 30 overclocks yet? So far on all of the reviews I've seen they haven't gone into overclocking the cards.
Apparently very well. Two people at Beyond3D set their drivers to OC automatically, and they were boosted to about 500/1000. Makes you wonder if the FX Flow was even necessary....

chizow: Oddly enough, nVidia had those same SC rendering issues before, as previous [ H ] reviews detail. I hope ATi isn't regressing with their drivers, though. :frown:

As for the rest of this fanman/boy quotefest, haven't you all had enough of this sort of bickering? ;) For the record, I think the 5900U looks like an excellent card. Heck, it may just be fast enough at 16x12 w/4xAA to make it worth paying an extra $100 for the 256MB version, where the extra memory will become useful. But the real question for me is when the 128MB version comes out, and what it will be clocked at. I'm most interested in the Ti200/4200 of the 5900 series, the mainstream, lower-clocked one due out a few months from now. That could prove very tempting for those who want NV35 performance on the relatively cheap, especially given how well the 5800 OCs using a similar HSF.

Edit: Apparently [ H ] without the spaces is a cue for block highlighting, like so. :)

chizow

Diamond Member
Jun 26, 2001
Originally posted by: Rogozhin
I believe that splinter cell's shadows and fuzz around the lights aren't being rendered on the nvidia drivers (check hardocp).
I've never seen or heard of the issue. Got a link to any pictures? I'm not even sure what you're referring to, but my GF3 Ti200 renders much closer to the SC reference (XBox of course).

"SC and DooM3 look to further emphasize this going forward. I'm currently testing/playing PlanetSide, which is set for release next week and suffers from hard locks on R3XX-based cards. There's still no fix for the problem, which seems to be caused by particle effects and dynamic lighting...hmmm. The developers and the game specifications recommend nVidia-based GPUs and it too will feature the "Way It's Meant to Be Played" logo. I'm sure I could wait 2 or 3 months for a fix by ATi, but I've spent too much time waiting in frustration as a result of ATi cards..."

This is all unproven.
Except for the FACT that Saturday's update specifically addressed "6) Fixed a graphics lockup on the ATI 9700 card." Since I'm guessing you're in the Beta, you should have no problem accessing the patch notes for May 10th. The issue still hasn't been resolved, and a new thread was started by TwistPS (an SOE dev.) asking for hard lock specifications. The thread is already over 120 posts, with at least 95% of the posters using ATi cards, and 80% using R3XX-based cards. The few nVidia posters don't seem to know the difference between a hard lock and a CTD. Linking to the forum won't do any good, but if you head to the "stability issues" thread, you should have no problem finding it at the top. You'll also see plenty of threads where the devs acknowledge the game has fewer issues that need fixing with nVidia cards.

The SC AA bug happens on nvidia hardware too, go check tech forums at splintercell.com.
I surfed the first 7 pages and found nothing on AA at all. I have however seen the same results on *my* monitor as well as a few review sites. I've also attempted to reproduce the problem at a whole 8-14 fps on my GF3 Ti200 w/out success. I don't own an FX (yet), so I'm relying on the reports from reviewers and have found no mention of AA issues on nVidia cards. Again, any specific links or pictures would be helpful.

And what do you think you're going to find at rage3d.com chitz? It's an ATi website, so the bugs listed there are only occurring for those users that own ATi hardware. Check nvnews (which is about 85% smaller than rage), or just go to the games' tech sites to find out that most of the bugs occur across the board.
I only mentioned Rage because it's a good archive of documented issues with ATi cards. I wouldn't bother to mention certain problems if I didn't experience them first-hand. BF1942, one of the most popular games as of late, had recurring issues up until Cat 3.2 for most people, including myself. Many other popular games suffered from "D3D stutter" and background application noise for months waiting for driver fixes. There are bugs that are mostly an annoyance and there are hardware-specific game breakers. Neither are pleasant, but hell, I'd rather deal with an off-blue shade of water in a poorly designed synthetic benchmark over a piercing light-ray tattoo on the back of Sam Fisher's head or a complete hard lock that requires a reboot in actual games.

The planetside question is still unfounded until it releases, as is doom3. Remember nvidia and futuremark 2003?
Without optimized drivers they scored very poorly, same situation with the BETA doom3.
How is the PlanetSide question unfounded? It's an issue as of right now; again, a reactive band-aid from ATi may very well stop the bleeding, but it's still a very real problem today. Don't even bring up 3DMark2K3, it's a poorly designed benchmark that had its flaws exposed. You've already been proven wrong about the FX/cheats/FP16 vs. FP24 and WHQL drivers, as Anand's explanation of the DX9 spec covers it quite well. Also, the previews of DooM3 are clear when they say the test set-ups do not favor either card. In one of the reviews, Carmack mentions the tested build is near-final in terms of actual performance. ATi may be able to improve their driver performance in DooM3, but hell, they've had the alpha and the hardware to run it for nearly a year. You'd think they woulda been ready for it. Guess they had too much on their plate to worry about a game due out a year later (which means 16 months to get it fixed in ATi time).

Cat 3.4s are out tomorrow and the det WHQL 43s are out today - about time, since it's been 6 months since an nvidia WHQL release.
Yep, Cat 3.4 is pretty much the last shot at getting this PlanetSide issue fixed before it goes Live next Tuesday, which is why I asked them to keep the Beta open a few days longer. If not, I'll be hitting the reset button a lot until next month and NV35. There's no need for WHQL drivers every 4-6 weeks when your new products have just hit market and your older products are mature and problem-free. OTOH, ATi card holders look forward to new bug fixes across the board with every release. I'm sure a lot of CS players will be anticipating 3.4; I'm looking forward to it myself for the R350 path enhancements as well as a possible fix to PlanetSide hard locks.

Chiz


Rogozhin

Senior member
Mar 25, 2003
Chiz

Here is a link to nvnews.net forum where almost all of the posters had the problem, a few of them own the geforce fx, others own 9700s and geforce 4s.

http://www.nvnews.net/vbulletin/showthread.php?s=&threadid=9923&highlight=Splinter+Cell

http://www.nvnews.net/vbulletin/showthread.php?s=&threadid=9098&highlight=Splinter+Cell


Here is a ubi programmer telling the masses that SC doesn't support AA and won't until patch 2.0 comes out.

http://forums.ubi.com/messages/message_view-topic.asp?name=pcdemo&id=zzlav

The planet side mess is something that will hopefully be fixed by the release or shortly after.

Rogo

Pete

Diamond Member
Oct 10, 1999
Weirdly enough, I swear I remember [ H ] showing pictures of rendering errors in Splinter Cell (incorrect lighting, textures), but I can't find them in recent reviews. Perhaps I have to go back a little further.

Anyway, it seems ATi isn't the only one with less-than-pristine drivers. Check Ante P's pic here of GTA VC (middle of page).

Check this topic on 3DM03 "sub-pixel jittering": NV3X subpixel precision in 3DMark03.

And finally, check out the plain-spoken headline in this new ET article: nVidia Appears to Be Cutting corners on 3DMark 2003: Driver Irregularities May Inflate nVidia Benchmark Scores. (Perhaps they changed the article title to something less inflammatory, but left the original title in the HTML header. :) ) I'll add one of ET's opening paragraphs, for objectivity's sake (though the pictures tell their own story):
Our own interpretation of these test results is that nVidia is improperly cutting corners to attempt to inflate its 3DMark2003 scores for the new GeForceFX 5900 Ultra. The company, on the other hand, believes that the problems can be attributed to a driver bug. Let's explore exactly what we found, and you can draw your own conclusions.

Anyone find anything remotely ironic--or, more specifically, deceitful--in nVidia's initial media barrage against 3DM03, claiming it would be a waste of precious resources to divert them to optimizing for 3DM03, and yet here we see what Dave Salvator describes as "manually designed and implemented for each frame of animation"?

I applaud their decisive wins in Doom 3, although the test seems to have caught ATi unaware, and thus I believe it wasn't right for Anand and [ H ] to bench D3 with the 9800P. I decry their apparently continued underhanded attempts to win all benchmarks at all costs. Perhaps I speak too soon, and this is an honest mistake--but nV's behavior since the NV30's "release" last November leads me to expect the worst.

Rogozhin

Senior member
Mar 25, 2003
Peter

I too saw those rendering errors in sc with the fx but can't seem to find the link, I'll continue looking.

rogo

Rogozhin

Senior member
Mar 25, 2003
As i was digging through urls I just found this posted at extremetech.com

They have unearthed a "cheat" used by nvidia to gain higher 3dmark2003 scores in the 44.03 dets.

Nvidia is attributing this to a "bug" in the driver. I myself find this hard to believe, and now realize that the doom3 beta benchmarks might be rendered using these same culling and clipping hacks hardwired into the driver itself.

This is just bizarre.

Rogo

BFG10K

Lifer
Aug 14, 2000
Some cases??? As Rollo has so eloquently shown, the FX5900 Ultra has won most tests.
Most but I wouldn't call it a white-wash. Yes, the 5900 is faster but I wouldn't say it obliterates it, especially when you factor in image quality.

Anand has also shown that image quality is practically even and one would be hard pressed to tell the difference.
That is simply untrue. For one, if you look at FSAA screenshots you can clearly see ATi is superior, e.g. HardOCP.

As for anisotropic filtering, with potentially double the sampling rate, ATi's quality method will almost always be superior to nVidia's.

My question to you is, why did Anand run both cards at 8x when the Radeon can do 16x and by his own screenshots 16x looks superior?

nVidia's 8x Quality.
ATi's 16x Quality.

In fact ATi's 16x performance mode looks just as good as quality.

The difference is apparent right at the back edge of the picture and it means that ATi's method stays sharper for longer distances during actual gameplay.

Rogozhin

Senior member
Mar 25, 2003
Chiz

Here is ubi's response to all the AA complaints about lights shining through walls, and their solution.

"Question
While playing Splinter Cell, the light coming from windows seem to cover everything on the screen. The light shines through walls and can been seen through Sam's body. What is causing this problem?

Answer
This seems to be a problem with certain video card Antialiasing settings. This problem can occur with certain Geforce and Radeon models and can easily be resolved by disabling Antialiasing on your video card. To do this:

1. Right-click on your desktop and choose Properties.
2. Click on the Settings Tab, and then click the Advanced button in the lower left corner.
3. Click on the tab that has the name of your video card to bring up the advanced properties for that card.
4. Under the Direct3D settings, find the Antialiasing settings and set them to 'off' or turn them down all the way.
5. Click Apply and OK to save your changes.

Next time you load Splinter Cell, you should not have this problem anymore."

This is from ubi.com; the url is huge so I won't post it unless asked.




rogo

nRollo

Banned
Jan 11, 2002
Rogo:
Rollo I don't quite understand what you're saying in that last post.
I'm saying that quoting Anand as saying 5900s aren't worth the money is like quoting Hugh Hefner saying hanging out with Playmates is no big deal.
Anand HAS a 5900, so it's easy for him to say it's not worth the money. Anand obviously loves hardware though, so I bet if he didn't have one, he'd buy one.

nRollo

Banned
Jan 11, 2002
Most but I wouldn't call it a white-wash. Yes, the 5900 is faster but I wouldn't say it obliterates it, especially when you factor in image quality.

BFG! You "always get the best cards"! Everyone says so. Does this mean you won't be going all 5900 with me and arguing which settings I should run it at?

BTW - my 9700 Pro sits in a box next to me. I got to use it again for a whole day. The 5900 reviews prompted me to sell while the selling is good; I'm using my 3 year old's Ti4200 till then. (He's reduced to the onboard GF2 MX, which runs his Sesame Street and Blues Clues fine.)

Rogozhin

Senior member
Mar 25, 2003
rollo

I would respond with my belief that most good reviewers consider the prices of the cards when they recommend them. Old Hugh might not realize that the playmates would knock most men to their knees with envy, but he sure as hell knows what they're WORTH.

Rogo

Jmmsbnd007

Diamond Member
May 29, 2002
Originally posted by: CheapTOFU
oh~ if you got $500, and want to play games...., buy Xbox, PS2, and game cube.. total=about $500
Have fun getting Half-Life 2 and Doom 3 on those consoles any time soon.

Jmmsbnd007

Diamond Member
May 29, 2002
Originally posted by: Frek
HUH???? (in response to Rollo's posts)

DOOM III isn't even going to be out until fall and you're already declaring the 5900 to be "THE" card for it. I'll give you this, you've got a lot of balls to make a statement like this knowing that ATI hasn't wasted any resources on optimizing drivers for a game that's 6 months out. Add to that, there will be a new ATI card out BEFORE DIII.

I just really don't see how you can make a statement like the 5900 will be "the" card for DOOM III when you haven't even seen what ATI is going to do with DOOM III yet.

My prediction: You'll be eating those words by the time DIII does come out. I give at least 80% odds that ATI will have the top benchmarking card for DIII when it actually comes out.

Also, the actual reviews of the 5900 do not say that the 5900 OWNS the 9800 (scroll up to post number 1 and see what I'm talking about). What they do say is that it's too close to call, the 5900 vs. the 9800. All the reviews I have read say the same thing: it's too close to call because the benchmarks flip back and forth between the 5900 and the 9800. The 5900 by no means OWNS the 9800. From your posts, Rollo, I get the impression you think the 5900 totally dominates the 9800. Better go back and re-read the reviews, but this time take off your nVidia-colored glasses first.

If I'm being honest, I would say that the 5900 looks like a really sweet card, but I would not go as far as Rollo does and declare it to OWN all Radeons. In my opinion (as well as AnandTech's) it's way too close to call.

Will I be buying a 5900? Hell no, my 9700 Pro plays every game I have at INSANE frame rates. I'll be upgrading my entire system just before DIII is released. In all likelihood that will be an ATI 9900 Pro (or whatever they're gonna call their next-gen card). On the very small chance that ATI does nothing between now and DIII (which just isn't gonna happen), I'll pick up the 5900.
ATI hasn't even announced their next card. And yes, the 5900 owns the 9800.

BFG10K

Lifer
Aug 14, 2000
Does this mean you won't be going all 5900 with me and arguing which settings I should run it at?
Probably not. For the first time in as long as I can remember, I'm happy enough with my current card to skip a generation. Yes, the 5900 is faster, but I don't feel the gain is worth the cost, especially since the image quality doesn't seem to hold a candle to my 9700 Pro's.

I'm quite happy to wait for the R360 (which is expected in July anyway) and beyond to see how the R400 turns out. Also until nVidia gains the ability to do 16x anisotropic filtering I'll likely be sticking with ATi from now on as I simply can't go back to using anything inferior.

I hope you enjoy your new card and I'd be eager to hear of your experiences with it when you get it. :)

Rogozhin

Senior member
Mar 25, 2003
Well said bfg10k

I see no need for an upgrade in my VGA department since I'm CPU limited with the 9700pro and my 2100+ Tbred @ 1.9.

I too will be interested in your impressions of the 5900 rollo.

I don't see any sites saying that the 5900 owns the 9800; where did you pick that one up, big guy?

rogo


Wurrmm

Senior member
Feb 18, 2003
Originally posted by: BFG10K
That is simply untrue. For one if you look at FSAA screenshots you can clearly see ATi is superior, eg HardOCP.

Thanks BFG10k!!! That AA IQ stuff is definitely good to see. I have not upgraded my GPU for some time and I am still running a Ti500, and even though the IQ difference is noticeable when I take a close look, it is not noticeable enough to make a difference to me. Maybe Nvidia will do stuff in future drivers that makes it better, like the AS stuff. If ATI releases something faster than the 9800 Pro sometime around the release of the FX5900, I will look into it. I like a good blend of speed and IQ, and I trust Nvidia and don't really feel like making the jump.

NicColt

Diamond Member
Jul 23, 2000
Originally posted by: Rogozhin
As i was digging through urls I just found this posted at extremetech.com

They have unearthed a "cheat" used by nvidia to gain higher 3dmark2003 scores in the 44.03 dets.
Nvidia is attributing this to a "bug" in the driver. I myself find this hard to believe and now realize that the doom3 beta benchmarks might be rendered using these same culling and clipping hacks hardwired into the driver itself. This is just bizarre.

Rogo

well well well, isn't that something..... here's one of the comments

"considering the fact that DOOM 3 scores might be fixed for a number of reasons (maybe revenge since ATI people released the Doom3 code on the net"

and

"nVIDIA said flat out, that the NV35 was built for Doom III. So they are making no bones about it"

then perhaps Doom III is being built around the NV35.... developing