
debating to get radeon 9800 pro vs. Geforce FX5900 Ultra

Originally posted by: VIAN

A tradeoff, of IQ for Speed. Sometimes we do it too, lower that AA so we can get those extra fps.

I wouldn't want IHVs making that decision for me. If I select 4xAA, I want 4xAA. If it's too slow, it should be up to me to disable it, not the IHV to try a "Folger's switch." You notice nV did it with 4xAA on the GF4MX, and they were caught. XGI did it with these settings, and they were caught. It's obviously not an unnoticeable substitution.
 
A tradeoff, of IQ for Speed.
I've got a card you might be interested in. It renders every game as fast as your refresh rate, guaranteed. It does this by constantly rendering a totally white screen and this is acceptable because I've made an IQ vs speed tradeoff for you.

Sometimes we do it too, lower that AA so we can get those extra fps.
"We" is not the same as a vendor forcing unacceptable image quality down our throats just to inflate benchmark results and make their product look a hundred times better than the garbage it really is.
 
What does IHV stand for?

A tradeoff, of IQ for Speed.

I've got a card you might be interested in. It renders every game as fast as your refresh rate, guaranteed. It does this by constantly rendering a totally white screen and this is acceptable because I've made an IQ vs speed tradeoff for you.
It's just like when 3dfx didn't have 32-bit support on the Voodoo3, it just wasn't playable at that high an image quality. Same thing here.

As far as having them do the trade-off, that is their problem, obviously we aren't stupid enough to buy a graphics card with only a knowledge that frame rates are high.
 
It's just like when 3dfx didn't have 32-bit support on the Voodoo3, it just wasn't playable at that high an image quality. Same thing here.
Actually it's nothing like that at all. 3dfx didn't magically gain 32-bit colour when you renamed Halo.exe to Hallo.exe.

XGI's stunts are nothing more than blatant and deceptive cheating.
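The rename test works because app-specific code paths are typically keyed on the executable's file name. A minimal sketch of that kind of detection; the profile names and settings here are invented for illustration, not any vendor's actual driver logic:

```python
# Hypothetical sketch of executable-name app detection. Profiles and
# settings are made up; real drivers do this inside native code.
APP_PROFILES = {
    "halo.exe": {"shader_precision": "fp16"},    # lowered precision
    "ut2003.exe": {"filtering": "bilinear"},     # reduced filtering
}

DEFAULT_PROFILE = {"shader_precision": "fp32", "filtering": "trilinear"}

def select_profile(exe_name):
    # Lookup keyed on the process name: rename Halo.exe to Hallo.exe
    # and the match fails, so the driver falls back to its normal path.
    return APP_PROFILES.get(exe_name.lower(), DEFAULT_PROFILE)

print(select_profile("Halo.exe"))   # detected: substituted settings
print(select_profile("Hallo.exe"))  # not detected: default settings
```

Renaming the executable is exactly the probe reviewers used: if scores drop after the rename, the speedup was tied to the name, not the workload.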

obviously we aren't stupid enough to buy a graphics card with only a knowledge that frame rates are high.
And that hardly puts XGI in a good light does it? It's clear what their motives are vs what the consumer wants.
 
It's just like when 3dfx didn't have 32-bit support on the Voodoo3, it just wasn't playable at that high an image quality. Same thing here.

Actually it's nothing like that at all. 3dfx didn't magically gain 32-bit colour when you renamed Halo.exe to Hallo.exe.

XGI's stunts are nothing more than blatant and deceptive cheating.
This is like Nvidia's app detection on Splinter Cell, so that it will automatically disable MSAA because it doesn't work. Same here: frames are too low with regular quality, so XGI app-detects it so that it can help give better frames.
 
No what's comical is your "nVidia can do no wrong" attitude and your "anyone who exposes nVidia's cheats is on ATi's payroll". You must be the only person in the world who thinks XGI isn't cheating and because of this there's little surprise that you don't think nVidia are cheating since their actions are more covert and less blatant than XGI's.

Are you listening to yourself here? I'm taking a nV can do no wrong attitude because I defend XGI against your unproven accusations? You are seriously losing it on this one.

What about Unwinder's findings?

You can sometimes break app-specific optimisations if you actively try; not earth-shattering news there.

FutureMark's?

Static clip planes I've never argued. They are at the point now where they have to reorder their shaders because nV's optimizations are API level.


A liar. Based on his own statements he is very easily proven so, too (ignoring nV/ATi completely). Also, Gabe was paid millions of dollars by ATi to promote their parts; that isn't a secret either.

The websites that used Unwinder's detection and noticed a dramatic performance loss on most apps?

They broke the app specific optimizations. Again, take a look at what you are trying to disagree with me on instead of trying to convince yourself I'm not agreeing with you on all points.

The websites that used custom benchmarks and showed nVidia taking a nose-dive in games which they had previously dominated?

Almost all sites use custom benchmarks, so I'm not seeing your point here. I do recall one review where a site was able to show a lot lower numbers than most others, but with almost all sites now relying mainly on custom benches I'm not getting your point with this.

The websites that exposed nVidia's "UT2003.exe" and similar detection?

They use app detection, again a point I have not argued. Can you get that through your head yet? I don't know how many times I have to state it.

The BS "optimization guidelines" that nVidia produced which were broken by the very next set of drivers?

PR lies? How's that FBuffer working on ATi's parts? How are those 9600XTs being faster than the R9700Pro at everything? PR monkeys lie, be they from nVidia, from ATi or from other companies simply paid to lie (such as Valve).

Nope, there's no evidence of nVidia's cheating anywhere.

You circled around the fact that they are using app detection, not something that has been debated. It seems that in your mind your definition of preexisting knowledge means that they take the application into consideration, and that in and of itself constitutes a cheat. You have stretched the "preexisting" phrase way beyond how it is used by any rational person. Look at the clip planes in 3DM2K3; that is an example of actual PEK being used to cheat. They knew exactly where the camera was going to draw and made a cheat that would eliminate everything that wasn't going to be seen on that exact path. That is using PEK to cheat.
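The clip-plane trick described above can be sketched in a few lines. The plane and the points below are invented, but the idea is exactly this: with the benchmark's camera path known in advance, anything behind a hard-coded plane can be discarded before it is ever drawn:

```python
# Hedged sketch of a static clip-plane cheat. A timedemo's camera path
# is known ahead of time, so a pre-baked plane can cull geometry the
# scripted camera will never see. All values here are invented.

def behind_plane(point, plane):
    # Plane is (normal, d); a point is behind it when n.p + d < 0.
    n, d = plane
    return sum(a * b for a, b in zip(n, point)) + d < 0

def cull(objects, plane):
    # Skip drawing everything behind the pre-baked plane.
    return [p for p in objects if not behind_plane(p, plane)]

# Keep the half-space (x >= 0) the scripted camera looks into.
STATIC_PLANE = ((1.0, 0.0, 0.0), 0.0)
scene = [(5.0, 0.0, 0.0), (-3.0, 1.0, 0.0)]
print(cull(scene, STATIC_PLANE))  # the object at x = -3 is never drawn
```

Move the camera off the scripted path, as 3DMark03's free-look mode allowed, and the culled geometry is visibly missing; that is how the cheat was exposed.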

Taking your argument to its logical conclusion, we should blame ATi for cheating, as they are launching an API-level interface for their driver that requires PEK of the given API. And what happens if a game comes out that doesn't support either of the APIs ATi supports? Then the board will fall on its face. You can take PEK as far as you would like; what you have not done is exhibit any sort of rational guidelines for what exploitation of PEK crosses the line. Again, simply using an API requires PEK. You need to spell out explicitly what your definition encompasses if you want to have a rational line of discussion on the subject.

For XGI, I never stated that they are not cheating; I do however believe that we should have evolved past the dark-age state of guilty until proven innocent. Based on everything I've seen, their drivers are really, really bad. Worse than ATi's. They are trying to use app-specific optimisations and have done an extremely poor job with it based on what I've seen. It is possible that they are cheating, but they would have to be d@mn near braindead to assume no one would notice. There is no way people are not going to catch these issues.

Your line of thought on this is that nV is ripping off consumers, which I think you may have even convinced yourself of, but take a good look around: no one running a nV board cares, because they are reaping the benefits of app-specific optimisations. The people you see throwing a fit are those who cheerlead ATi, and that's it.

Pete-

I wouldn't want IHVs making that decision for me. If I select 4xAA, I want 4xAA. If it's too slow, it should be up to me to disable it, not the IHV to try a "Folger's switch." You notice nV did it with 4xAA on the GF4MX, and they were caught. XGI did it with these settings, and they were caught. It's obviously not an unnoticeable substitution.

Like how ATi got caught with their AF problems? What degree of forced reduction of IQ for speed do you consider acceptable (I personally think your board has eye-gougingly poor AF; you still have the R9K, right?)? Everyone does it, so what do you think is OK?
 
Originally posted by: Genx87
The only problem so far is an occasional bsod at weird times when there is no heavy video usage.

Wow that sounds like quite a deal 🙂

Personally, I would say to wait until about April if you can. You will be able to get much more of a card for the same price.

Or you could wait forever and never get anything at all because the next generation is ALWAYS 6 months away
 
This is like Nvidia app. detection on Splinter Cell, so that it will automatically disable MSAA because it doesn't work.
That's probably more of a compatibility fix than a performance fix. Also 3dfx made it clear right from the start that they didn't support 32 bit colour.

Same here, frames are too low with regular quality so XGI app. detects it so that it can help give better frames.
Right, so they should not only make it clear that horrendous quality is what the user can expect from this card, but also not hide it by producing screenshots that make everything look how it should be.

And Ben, as I explained to you before, I've finished with this pointless discussion with you. The post you quoted was made before I posted my last reply on the matter.
 
Also, Gabe was paid millions of dollars by ATi to promote their parts, that isn't a secret either.

Actually Valve, a company, was paid millions to promote ATi's parts, just like how Activision was paid millions by Nvidia to promote their parts.
 
Actually Valve, a company, was paid millions to promote ATi's parts, just like how Activision was paid millions by Nvidia to promote their parts.

Valve is mainly owned by a single person though, unlike Activision which is a public corporation.
 
Been through a lot of hardware combinations with both ATi and Nvidia. Haven't had serious problems with either one. Also, IMO, crashing is system instability; it's not about drivers/video card compatibility.
 
Right, so they should not only make it clear that horrendous quality is what the user can expect from this card, but also not hide it by producing screenshots that make everything look how it should be.

That's why, like I said before, there are numerous reviewer sites.
 
Originally posted by: Nvidia Fanboy
You guys crack me up with this "performance" argument.
Both cards are so damn fast, what does performance matter at this point? ATI 180fps and Nvidia 145fps*. In what way is 180 superior to 145 in gaming? Negligible. 50fps to 30fps is a much different story, of course. But when the fps are so high now, you just can't use performance as a solid argument nowadays. Go ahead and try your DX9 argument now. Those games won't be out until after the next gen of cards anyway. You can't tell which has better IQ unless you have 2 PCs side by side running ATI and Nvidia respectively, both running the same game at the same settings. IMHO, all this is just a huge REACH in terms of "mine is better than yours".

Long Cool Mother, you can be assured of the following:

If you buy the 9800pro, you will have a 90/10 chance of trouble-free installation and stability.
If you buy the 5900U, you will have more like a 50/50 chance of trouble-free installation and stability.


Do a search for a recent poll run in this forum for Nvidia vs. ATI stability and problems and see the results for yourself.
Just among users in this forum, they report Nvidia having a much, much larger percentage of issues.

Some users will say " I have a Tixxxx whatever and it has been 100% stable in everything I throw at it. Nary a hitch of problems, and that was with me upgrading from a Rxxxx whatever too."

They are the very lucky few, believe me. I upgraded to Nvidia* from my 8500 and OMG, what an ordeal to get it running HL2 decently. I had a top-notch mainstream motherboard/RAM/power supply and the works as well.

Go with the (insert ATi card here) and be done with it. You will not need a bottle of Excedrin and a glass of water next to you at all times, and you won't lose nearly as much hair because you won't be ripping it out. Your dentist won't make any money off you because you won't be grinding your teeth.

/solid advice

ATi Fanboy version
*Wait, so if I buy an NV-based card, I will get 145 FPS? Even in HL2?! And 180 on an ATi card! Wow, why upgrade from my Quadro 2 EX!
Your argument about DX9 is just nonsensical. Why would anyone get a card that won't last them through the next generation of games? Especially when you're spending more than $200 on a card?

Because your arguments aren't backed up, they are easy to turn around. See above.




 
You guys crack me up with this "performance" argument.
Both cards are so damn fast, what does performance matter at this point? ATI 180fps and Nvidia 145fps*. In what way is 180 superior to 145 in gaming? Negligible. 50fps to 30fps is a much different story, of course. But when the fps are so high now, you just can't use performance as a solid argument nowadays. Go ahead and try your DX9 argument now. Those games won't be out until after the next gen of cards anyway. You can't tell which has better IQ unless you have 2 PCs side by side running ATI and Nvidia respectively, both running the same game at the same settings.
You can notice a difference between 180fps and 145fps. Later, when the card starts to show its age, it will be the difference between 50fps and 30fps respectively. Even now, some high-end DX9 cards perform that low. A decent number of DX9 games are out now. IQ is similar on both ends of the stick. Compatibility is very up in the air; to me they are both great cards and have similar compatibility.
 
Also, IMO, crashing is system instability; it's not about drivers/video card compatibility.

You running a R3x0 board? Install the Catalyst 3.2s and fire up any of the Half-Life powered games and start a game. Now try and enter the menu. See you after a reboot 😉

If you have a R2x0 board install the 3.8s, 3.9s, or 3.10s(the R9K was the most problematic IIRC), just try and sit down and play Halo for a few hours straight, you won't have to reboot for that one, the vid card driver combo will take care of it for you.

If you have Call of Duty, install the 3.9s running a R3x0 without hot fixing the drivers and try and play it for a few hours.

You are right saying that crashing is due to system instability, which can easily be caused by video card drivers. nVidia has its own bugs; I just don't know of any way to reproduce a hard lock bug using any nV part/game/driver combo (I've tried).
 
NV bugs I've noticed.

Mafia Demo. Ti 4600 v43 or v44, not sure which. Polygon weirdness in the beginning cut scene.

COD Demo Single Player, not Dawnville v53.03. FX5900. Missing Textures.

Far Cry v53.03. Flickering 2 Textures in certain areas.

MOH:AA Spearhead FX5900 53.03 - Flickering Textures.

I think I also experienced another, but not sure what it was.

As of right now, I have had more problems with Nvidia than ATI. And if you count problems that weren't fixed, I haven't had any problems with ATI, but I still have them with Nvidia.
 
That's why, like I said before there are numerous reviewer sites.
That has absolutely nothing to do with the issue. The issue here is XGI, their product and their attitude towards it.

You running a R3x0 board? Install the Catalyst 3.2s and fire up any of the Half-Life powered games and start a game. Now try and enter the menu. See you after a reboot 😉
You're running a Ti4xxx? Install Detonator 4xxx and start playing Serious Sam 2. See you after the reboot.

nVidia has bugs too, you know.

I just don't know of any way to reproduce a hard lock bug using any nV part game/driver combo(I've tried).
That means precisely nothing unless you're also prepared to admit that some ATi users who tell you they don't have crashing could also be telling the truth. That certainly hasn't been your position in the past.

It took me less than five minutes to find this one in nVidia's list of fixes for 52.16:

GeForce4 MX 440, Windows XP: Blue-screen crash in Flight Simulator 2004 when display settings selected during a flight with 2x or Quincunx antialiasing enabled.

And it's not the only one listed there either. Here's another and it's also an open issue from the latest 53.03 drivers:
Quadro NVS, Windows XP: Driver causes blue-screen crash. A fix for this issue will be included in the next driver release.
 
You're running a Ti4xxx? Install Detonator 4xxx and start playing Serious Sam 2. See you after the reboot.

What version of the 4x.xx? Tried the 41.04, 44.03 and the 45.23s and not a hitch. I did run into an odd Z error that the game had issues with on a fresh install of the title, but then I recalled you could force the game to use full Z accuracy and that went away. So which driver is it that I need to be running?

nVidia has bugs too, you know.

In my last post-

nVidia has its own bugs

Obviously you are telling me something I don't know there.....

That means precisely nothing unless you're also prepared to admit that some ATi users who tell you they don't have crashing could also be telling the truth.

I can bring up a post from one ATi user in ~the last week that said he didn't have hard lock issues in a game when he already started a thread a few weeks prior talking about having hard lock issues in the game.

GeForce4 MX 440, Windows XP: Blue-screen crash in Flight Simulator 2004 when display settings selected during a flight with 2x or Quincunx antialiasing enabled.

You know I don't have a GF4MX, I also don't have WindowsXP and I'm not going to go buy both to prove the bug exists. I'll take the statement that it does happen as fact, still not one I can see for myself.

And it's not the only one listed there either. Here's another and it's also an open issue from the latest 53.03 drivers:

Don't have a Quadro either. Right now I have a NV15, NV25 and I think I have a NV10 kicking around here somewhere. I have Win95, Win98, NT4 and Win2K (along with Mandrake, 7.2 IIRC) for OSs (probably pick up XPPro when I build my A64 rig). This isn't as simple as, say, installing the Cat 3.2s, running any HL-powered game and trying to go to the menu with any R3x0 based card.
 
Originally posted by: cindy22
LOL!
That's because Doom 3 doesn't use DirectX 9 and has to use lower precisions to win, and most games will use DirectX 9, not Doom 3's lower precision; that's where ATi shines. He also says that when you use the standard fragment path like ATi does, Nvidia would be much slower, which proves Nvidia cards are much slower than ATi when you compare apples to apples, not apples to oranges!
His article also shows that even though Nvidia uses mixed mode in Half-Life 2, which degrades image quality by using mostly DirectX 8 to improve performance, Nvidia still gets stomped by ATi.
He also says that in DirectX 9 and future games Nvidia will do poorly compared to ATi.
Doesn't that tell you something? It should tell you that Nvidia has to use lower quality and sometimes cheats in other tests or games in order to win or barely catch up to ATi!

There will be a ton of games using the Doom 3 engine by the end of 2005.

Half-Life 2 was just delayed AGAIN until September.

What're those other new DX9 games that are flying to the shelves now?
 
Originally posted by: Johnbear007
Originally posted by: Genx87
The only problem so far is an occasional bsod at weird times when there is no heavy video usage.

Wow that sounds like quite a deal 🙂

Personally, I would say to wait until about April if you can. You will be able to get much more of a card for the same price.

Or you could wait forever and never get anything at all because the next generation is ALWAYS 6 months away

There's a small difference between a new chip that's a little faster and a revolutionary board design (from both companies) that they are claiming 100% performance increases with.
 
What version of the 4x.xx?
I don't remember exactly as it was two years ago or so. I think it was 44.35 or one of those.

I can bring up a post from one ATi user in ~the last week that said he didn't have hard lock issues in a game when he already started a thread a few weeks prior talking about having hard lock issues in the game.
I saw that one and I agree, some people lie or forget.

But others don't.

You know I don't have a GF4MX, I also don't have WindowsXP and I'm not going to go buy both to prove the bug exists.
I don't expect you to - if you say you've had no crashes then I believe you even though there are listed crashes in nVidia's driver list. All I'm saying is that there are people like you in the other camps.

This isn't as simple as say, installing Cat 3.2s running any HL powered game and trying to go to the menu with any R3x0 based card.
How simple it is is not only irrelevant but it's also non-constant. I mean it's not simple by any means for someone to get that problem if they don't have a R3xx, they don't have the 3.2s and/or they don't run HL games. OTOH it's very simple for someone who has a Quadro and is running Windows XP to get the problem.

In addition, not everyone had the HL problem with said configuration. Also the 3.2 drivers aren't even a factor as they're so old. Or should we start digging through readme files of older games which list nVidia problems? Or would you like me to list all of the issues in nVidia's latest two driver builds?
 
VIAN-

What machine are you running on Ben?

Currently-

MSI K7N2 (very unimpressed; I've worked with numerous 8RDA+ boards and they put this one to shame, as do all of the Asus NF2 boards I've used in builds, and most of them cost less too)
Thunderbird 1.4GHz (crushed my 2100XP@2.1GHz being a stupid impatient moron 😱)
BFG Ti4200 128MB (had a built-by-ATi R9500Pro, traded it for this and quite happy I did)
512MB Kingston PC2700
WD 80GB 7200
Maxtor 20GB 7200
Lite-On 16x DVD
HP 16x8x24 CDRW (only black CDRW drive I had kicking around, waiting for the DVD-RW market to stabilize a bit more)
CL SB512 PCI (complete and utter POS; CL's driver team should be throttled, daily..... for years on end. Morrowind had major issues with my Philips Seismic Edge, and my Vortex boards are all fairly useless under Win2K; wish Bethesda would just fix the Morrowind engine to work with any sound card)
TT 420Watt PS (not horrible, but I wouldn't recommend it)
TT XIII case

Moving to an Athlon64-based system (probably a 3K+ unless the prices move a bit or the higher-rated parts have more OC potential); I already have the cash waiting for it, just waiting for the rest of the parts to launch (NF3-250 or an alternative A64 platform with proper OC support, PVR Series5 or NV40, SoundStorm-based sound card or anything comparable that is not CreativeLabs). Throwing in a new PS (Antec True550), moving to a GB of higher-speed RAM (likely OCZ 3500), and considering a 2x120GB RAID configuration for the HDs. I was seriously considering going with a VapoChill cooling unit, but the wife talked me out of it (mainly because it wouldn't work with both the CPU and GPU without some serious modding; checked out the Inventiv unit also, but I can deal with ~5F instead of ~-20F 😉). Pretty much I have an antique for my primary rig right now, and I'm getting tempted to say the hell with waiting (except on the vid card) and just build one based on the AMD chipset, but I'll kick myself in the @ss if it turns out the NF3-250 helps out the OCing a lot.

BFG-

I don't expect you to - if you say you've had no crashes then I believe you even though there are listed crashes in nVidia's driver list. All I'm saying is that there are people like you in the other camps.

I have no real way of knowing that. If you run into problems I know you'll state them, same with oldfart, but there aren't a lot of people in the ATi camp who will.

I mean it's not simple by any means for someone to get that problem if they don't have a R3xx, they don't have the 3.2s and/or they don't run HL games.

I was bringing it up as it is very easily reproduced by the majority of ATi enthusiasts, there isn't anything comparable on the nV side.
 