nVidia's Problems


JJN

Member
Dec 28, 2003
Originally posted by: SilverTrine
Originally posted by: Insomniak
Originally posted by: SilverTrine

I hope ATi sues you for defamation. Trust me, 3dmark2003 is not detected by ATi drivers and is by Nvidia. We can say 100% that Nvidia cheats and ATi doesn't.



lmao....fanboy much?

Truth hurts, doesn't it, sparky. Nvidia is a cheating dog of a company who continues to cheat to this very day. ATi does not cheat at all; this is verified by 3dmark2003.
I'm not into debating morons, you either accept reality or you don't. I'm not going to be like BFG and spend hours point-for-pointing trolls when the above is 100% truth.



Silvertrine,
How do you explain ATI's actions in the past with the 8500? Just like you are anti-nvidia, probably to the point where you wouldn't buy their next generation card, many people are just as burned up by some of ATI's hijinks.

Innocent lil ATI
 

Insomniak

Banned
Sep 11, 2003
Originally posted by: SilverTrine
Truth hurts, doesn't it, sparky. Nvidia is a cheating dog of a company who continues to cheat to this very day. ATi does not cheat at all; this is verified by 3dmark2003.
I'm not into debating morons, you either accept reality or you don't. I'm not going to be like BFG and spend hours point-for-pointing trolls when the above is 100% truth.


lmao.....fanboy much?
 

BenSkywalker

Diamond Member
Oct 9, 1999
Nvidia also stated that their drivers were intelligently culling the 3dmark scene so as to not render what isn't being seen; surely if you're smart enough to not believe this without some proof, you would be smart enough to not believe everything nvidia says about the pixel shader test.

I'm talking about driver-level shader compilers. If you are smart enough to auto-equate driver-level shader compilers with cheats, then you should know that ATi is now using driver-level shader compilers also... oops :eek:
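Since "driver-level shader compiler" gets thrown around in this thread without much explanation, here is a rough, purely hypothetical sketch of the kind of app-agnostic pass being described. The instruction format, register names, and scheduling rule below are all invented for illustration (real drivers work on compiled bytecode, not tuples); the point is only that nothing in such a pass looks at which game or benchmark submitted the shader.

```python
# Hypothetical, simplified model of an app-agnostic shader recompiler pass.
# Instructions are (destination, opcode, sources); register names are made up.
INPUTS = ("c", "t", "s", "v")  # constants, texcoords, samplers, vertex inputs

def reorder_independent_ops(instructions):
    """Schedule each instruction as soon as everything it reads is available,
    letting texture fetches drift ahead of dependent arithmetic. Nothing here
    looks at which application the shader came from."""
    pending, ready, scheduled, written = list(instructions), [], [], set()
    while pending or ready:
        still_waiting = []
        for inst in pending:
            dest, op, srcs = inst
            if all(s in written or s.startswith(INPUTS) for s in srcs):
                ready.append(inst)
            else:
                still_waiting.append(inst)
        pending = still_waiting
        if not ready:
            raise ValueError("undefined or cyclic register dependency")
        dest, op, srcs = ready.pop(0)
        scheduled.append((dest, op, srcs))
        written.add(dest)
    return scheduled

shader = [
    ("r1", "mul", ["r0", "c0"]),    # waits on the texture fetch below
    ("r0", "texld", ["t0", "s0"]),  # no register dependencies, hoisted first
    ("r2", "add", ["r1", "c1"]),
]
print(reorder_independent_ops(shader))
```

Either vendor could legitimately run something like this on every shader it receives; the argument in the rest of the thread is really about whether that is all that is happening.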
 

kylebisme

Diamond Member
Mar 25, 2000
Originally posted by: JJN
Originally posted by: SilverTrine
Originally posted by: Insomniak
Originally posted by: SilverTrine

I hope ATi sues you for defamation. Trust me, 3dmark2003 is not detected by ATi drivers and is by Nvidia. We can say 100% that Nvidia cheats and ATi doesn't.



lmao....fanboy much?

Truth hurts, doesn't it, sparky. Nvidia is a cheating dog of a company who continues to cheat to this very day. ATi does not cheat at all; this is verified by 3dmark2003.
I'm not into debating morons, you either accept reality or you don't. I'm not going to be like BFG and spend hours point-for-pointing trolls when the above is 100% truth.



Silvertrine,
How do you explain ATI's actions in the past with the 8500? Just like you are anti-nvidia, probably to the point where you wouldn't buy their next generation card, many people are just as burned up by some of ATI's hijinks.

Innocent lil ATI

ati explained that issue in the link you provided:



Most of our optimizations for Quake 3 and other applications have no impact at all on image quality, and therefore it would be pointless to allow users to disable them. The current RADEON 8500 driver revision has an issue that prevents it from correctly interpreting the texture quality slider setting in Quake 3. This issue will be corrected in the next driver release.
Note that the texture quality setting is just one of many possible ways that users can increase image quality in Quake 3 at the expense of performance; forcing on anisotropic filtering or disabling texture compression are alternative methods. It is also important to note that the image quality obtained using all of the standard image settings in Quake 3 (fastest, fast, normal, and high quality) can be observed to be equal to or better than any competing product (try it!); it is only in the special case of selecting "high quality" AND turning the texture quality slider up to the highest setting that a potential discrepancy appears.

furthermore, it only affected like 4 or 5 textures in the whole game, and if i recall correctly only 1 of those textures was in one benchmark map. so if they really were trying to cheat us by raising their benchmark score they sure did a pitiful job of it. not to mention the fact that the textures were so blurry it was blatantly obvious, and their next drivers fixed the issue without the performance dropping. together that makes it pretty clear that it was just a bug.
 

BenSkywalker

Diamond Member
Oct 9, 1999
Most of our optimizations for Quake 3 and other applications have no impact at all on image quality, and therefore it would be pointless to allow users to disable them.

ATi publicly coming out and boasting about their 'cheating'.... hehe. Sorry, but with all this stupid BS about application optimizations, I never even thought to just pull up quotes from ATi stating they were doing it themselves (despite the denial by so many in the faith).
 

Fraggster

Member
Oct 3, 2003
Originally posted by: CaiNaM
Originally posted by: BenSkywalker
Alright let's get all of nV's issues grouped together-

Pixel Shader 2.0 performance is horrible
but improving...
Missing some of the PS 2.0 level features in their drivers, HDR as an example which is going to be used by HL2
aye, agreed.. what's up with that?
Driver bugs completely screwing up the fog in several titles (MOH as an example)
hmmm.. haven't noticed that, but i don't play MOH either
AA quality @ 4x and higher sucks compared to ATi
i disagree there. there are some subtle differences i've noticed under close scrutiny in ati's favor, but honestly, overall the differences to me are either unnoticeable or negligible.
Gets their @ss kicked in some games, embarrassingly so (Mafia as an example)
no personal experience.
Dropping AF quality compared to the NV2X line while ATi is making huge gains
a tradeoff for performance that may offend some more than others. i still prefer nv af over ati.
'Brilinear' hack to improve performance
see previous comment.
Cheats in 3DMark2K3 with static clip planes with older drivers
not that i ever put much into 3dm other than it being a pretty demo, but hey, i raised hell with ati's q3 "cheat" awhile back, so gotta be fair and slam nv here.
Per app optimizations can create confusion on how the boards will perform across the board
only nv does this?
Am I missing any here? Are there any that nV users disagree with? It would be nice if we could just get all of this out so it doesn't have to be repeated ad nauseam.

i had a 9700p. great card imo (and still great even today), especially when considering the fiasco which was the 58xx series of nv cards. when i saw a 5900nu for $170, i couldn't resist and i picked one up. spent a lot of time with it over the last 25 days, as it's in my main pc (one of 3 in my work area) & my office is at home so i spend 12-16 hrs almost every day in front of it. i thought it was a good card and a great value.

the 9800p was available locally this weekend for $199, and having several improvements over the 9700p (now residing happily in my wife's pc), i picked one up.

ran some ut demo benchmarks with the 5900, then removed it and installed the 9800p using cat 4.2 and ran some ut demo benchies on it, along with a bunch of screen captures. i had some issues with daoc, and realizing i had neglected to take screenies w/ the 5900, out came the 9800 and the 5900 went back in.

hmm.. daoc issues went away..

you know, i know i'll get a lot of crap for this, but after throwing the 5900 back in, it dawned on me that i preferred that card over the ati. the 2d was crisper, the 3d colors were more vibrant. performance difference was negligible between the 2, some games favoring the nv a bit, and some games favoring the ati.. and some games it went back and forth depending on the resolution.

maybe some of these issues were related to the new catalyst drivers (i had run 4.1 briefly before i replaced the 9700p w/ that 5900, but i WAS going to go back to the 3.10s actually, not thinking the 4.1s were that great), maybe not. i'll have to put the 9800p back in and fiddle with it some, but having the 5900 back in my main rig, i'm kind of liking it....

bow down b4 you as you speek teh truth:D
 

kylebisme

Diamond Member
Mar 25, 2000
Originally posted by: BenSkywalker
Most of our optimizations for Quake 3 and other applications have no impact at all on image quality, and therefore it would be pointless to allow users to disable them.

ATi publicly coming out and boasting about their 'cheating'.... hehe. Sorry, but with all this stupid BS about application optimizations, I never even thought to just pull up quotes from ATi stating they were doing it themselves (despite the denial by so many in the faith).

well it is pretty silly to base your argument on two-year-old statements when things have changed dramatically since then. ;)
 

JJN

Member
Dec 28, 2003
Yeah that article was old, but the point is no one is innocent. ATI holds a lead these days but it's small. I'm not a programmer, code writer or anything like that, but I do build computers and swap out hardware, and most importantly I'm a GAMER. Someone who has been a top player in the world playing RTSs, and now I'm playing BF1942 and CoD. I've been to plenty of LAN parties and played games with hundreds of people in their teens/low 20's. For me the proof will always be in the pudding. Some people like to get on here and trash nvidia and make it sound like their cards are total crap, well I've seen so many FX5900, 5950, R9700's, 9800's and I can't even tell them apart during gameplay. There's no friggen way that anyone can tell a difference of 10 FPS while playing a game, and that's about all the difference ever is. To me the difference between a 9800 Pro and the FX5900XT I currently have in my system isn't worth the time to change cards and drivers. But hey, what do I know, I only play games!
 

VIAN

Diamond Member
Aug 22, 2003
Compared to what it was expected to do it was a total flop. Nintendo basically owned the market until then. And Sony, with a brand new model which offered worse graphics, totally dominated the N64.
Saturn may have kicked ass in Japan but I don't know a single person who owned one.

PS2 may have been harder to program for but it was riding the wave of the PS1 and had backwards compatibility with PS1 games. Devs were obviously willing to wait until the tools were ready for the PS2.
I don't think 40 million is that much of a total flop compared to PSX's 60 million. The graphics weren't that much worse, it was more pleasant to look at than the N64 and had double the polygon power. I used to know a handful of people that owned Saturns. It was a fun machine. I borrowed it once. I don't like the backwards compatibility of the PS2 because it makes the previous console useless. Why don't you just collect all the games and then buy the last console Sony makes? You could save yourself a lot of money. It's extra cost that is worthless, along with the DVD player.

As for Shenmue II, well if you're at the part where Joy is telling you to stay at the guest house, then you haven't even really started the game IMO. There is so much to do to finish the game from where you are. On Dreamcast you'd still be on Disc 1.
Are you sure? I already put a couple of hours into it. It is after I stay with Master Tao. After I had to catch the falling red leaves with my fingers. After I found that dude's martial arts book in the Man Mo Temple library.
 

BenSkywalker

Diamond Member
Oct 9, 1999
Thanks for the sticky Evan :D

well it is pretty silly to base your argument on two-year-old statements when things have changed dramatically since then. ;)

Feeling guilty you just blew months of BFG's defense out of the water with a post that was supposed to bolster his position? :p :D

I don't know why I didn't think to just pull up a quote of them stating the fact that they did it, would have saved a lot of time and effort.

I'm running a R9800Pro, and if ATi is optimizing on an app-specific basis, all the power in the world to them. If they could come up with a couple of per-app tricks for KoTOR to get soft shadows working (along with working around the odd AA+AF bug in the game and fixing the Hydravision crash bug they currently have) and a Halo one to enable the proper render state (GearBox has it screwed up right now for the cloaking device), I'll give them a big pat on the back. Any performance in apps they pull out is also more than welcome, even if it's only in the apps sites like to benchmark; better than no gains. Hey, if they want to spend a few weeks tweaking the hell out of their drivers for FarCry, feel free, it will be benched widely I'm certain, and I would greatly appreciate a performance boost there :)

There is nothing wrong in any way with doing app-specific optimizations if you are rendering everything properly (you pay a lot extra to get a pro board in no small part precisely for that reason). BFG may even see the light of this with ATi's admission that they do it and, like nV, don't give the users the option to override it. You can say it creates a problem in terms of public perception, which it does, but at least with ATi not being as talkative about it as nVidia, it creates less confusion on their part (although nV was rather forced into talking about it more due to their blatant static clip planes cheat).
 

kylebisme

Diamond Member
Mar 25, 2000
the cloaking device in halo is screwed up on nvidia hardware; it has always worked just fine for ati's dx9 cards. as for KotOR, bioware has always been crappy at supporting ati and it doesn't seem like that is going to change no matter what happens.
 

Genx87

Lifer
Apr 8, 2002
And what about those apps that don't have application detection? Who wants to support a vendor that is required to constantly hard-code their drivers just so games work? A vendor like that is flaky at best.

Just to get it to work? hmm seems like a lot of new games come out and ATI is releasing hotfixes for them :)

Flaky I guess hehe
 

sxr7171

Diamond Member
Jun 21, 2002
Originally posted by: VIAN
Compared to what it was expected to do it was a total flop. Nintendo basically owned the market until then. And Sony, with a brand new model which offered worse graphics, totally dominated the N64.
Saturn may have kicked ass in Japan but I don't know a single person who owned one.

PS2 may have been harder to program for but it was riding the wave of the PS1 and had backwards compatibility with PS1 games. Devs were obviously willing to wait until the tools were ready for the PS2.
I don't think 40 million is that much of a total flop compared to PSX's 60 million. The graphics weren't that much worse, it was more pleasant to look at than the N64 and had double the polygon power. I used to know a handful of people that owned Saturns. It was a fun machine. I borrowed it once. I don't like the backwards compatibility of the PS2 because it makes the previous console useless. Why don't you just collect all the games and then buy the last console Sony makes? You could save yourself a lot of money. It's extra cost that is worthless, along with the DVD player.

As for Shenmue II, well if you're at the part where Joy is telling you to stay at the guest house, then you haven't even really started the game IMO. There is so much to do to finish the game from where you are. On Dreamcast you'd still be on Disc 1.
Are you sure? I already put a couple of hours into it. It is after I stay with Master Tao. After I had to catch the falling red leaves with my fingers. After I found that dude's martial arts book in the Man Mo Temple library.


I think I got mixed up about where you are in the game. Not everybody gets the same experiences. Well, I'd say you're on the second disc on the DC version right now (I still measure the game that way). You're getting pretty close to going to Kowloon, which is disc 3 on DC. Kowloon is huge and there is a lot to do over there. The action gets really intense here and you're going to need some money to get through some parts. In fact the game just becomes awesome on disc 3, with some insane fights and a taste of the Hong Kong underworld. The game also has some amazing indoor sequences with beautiful shopping centers and stores. I really hope you get your Xbox fixed. If I were you, I'd get that $50 extended warranty from M$ and then finish playing Shenmue II and sell that useless system. You could also fish out your old Dreamcast (or buy one for $30) and then try to get the Dreamcast import (which sells for $50+). But you could sell your copy of Shenmue II for as much as you bought it for, since there are diehard fans who want to play it only on Dreamcast (they hate the English voice dubbing on the Xbox version - personally I don't care). Either way you'll want to beg, borrow, or steal to get it.
 

reever

Senior member
Oct 4, 2003
Originally posted by: Evan Lieb
BenSkywalker,

You get a sticky for effort.

rolleye.gif
thanks, now I can take Ben's unrelenting attitude and opinions on this issue and instantly associate it with AT reviewers.
 

First

Lifer
Jun 3, 2002
Originally posted by: reever
Originally posted by: Evan Lieb
BenSkywalker,

You get a sticky for effort.

rolleye.gif
thanks, now I can take Ben's unrelenting attitude and opinions on this issue and instantly associate it with AT reviewers.

...or you could take it as a way for the mods to filter out ATI v. NVIDIA spam.
rolleye.gif
 

BFG10K

Lifer
Aug 14, 2000
And what supports that assertion?
Didn't you just state that they had driver issues that were cured by application detection? So what about the next batch of games that hadn't been detected by the drivers yet? And the next ones? And the ones after that? Or do you think Quake III and UT detections automatically help Halo and Splinter Cell?

You mean catching the particular string of code used for the pixel shaders and reordering that for optimal execution on their architecture?!
Catching the string isn't a problem. Detecting 3DMark in order to do it is. Why go to so much effort to detect 3DMark (splash screen, string, shaders to name a few) if your optimizations don't rely on it and only need shader strings to work?
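To make the distinction being drawn here concrete, this is a hypothetical sketch of the other approach: a driver that only swaps in a pre-written shader when it recognizes the application asking for it. Every executable name, hash, and shader string below is invented for illustration; nothing is taken from any actual driver.

```python
import hashlib

def fingerprint(shader_source: str) -> str:
    """Identify a shader by a hash of its source text."""
    return hashlib.sha1(shader_source.encode()).hexdigest()

# Pretend this is a shader shipped with a benchmark, plus a hand-written
# replacement prepared for it ahead of time.
ORIGINAL = "ps_2_0  texld r0, t0, s0  ..."
REPLACEMENTS = {fingerprint(ORIGINAL): "ps_2_0  hand_tuned_equivalent  ..."}

def pick_shader(shader_source: str, exe_name: str) -> str:
    # The contentious part: the swap is gated on recognizing the application,
    # so renaming the executable (or changing the shader text by one character)
    # silently restores the slow path.
    if exe_name.lower() == "3dmark03.exe":
        return REPLACEMENTS.get(fingerprint(shader_source), shader_source)
    return shader_source  # every other app gets exactly the shader it submitted

# Same shader, different answer depending only on who asked:
print(pick_shader(ORIGINAL, "3DMark03.exe") == pick_shader(ORIGINAL, "renamed.exe"))  # False
```

A generic optimizer like the reordering sketch earlier in the thread needs none of this gating, which is why the splash-screen and executable detection is what draws the accusations.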

No, he didn't. He didn't do anything remotely close to that.
Yes he did. If you can't read and/or comprehend that's not my problem.

Do you realize how you sound? nVidia is obviously cheating because trivial changes impact their score but the trivial changes can't impact the score because they cheat so well.
The trivial changes don't change the other vendors but sucker-punch nVidia. Out of that you arrived at the conclusion that FutureMark is cheating and also that the other vendors are too because some of them don't have the physical ability to render what is requested. Of course nVidia is yet again blameless in the whole thing.

Only a true nVidia zealot can come to such a conclusion using said evidence.

Can you at the very least pick one position for your conspiracy theories per post?
How about you stick to one story instead of constantly changing it for every anti-nVidia piece of evidence that is presented to you? Oh sorry, you do stick to one stance: any such evidence is automatically invalid.

Almost everyone uses custom benches now,
And there's a good reason too. Don't try to hide the issue by posting Captain Obvious comments. You know darned well why the practice was started and is now widespread.

I'm not seeing your point with this. Particularly considering we are discussing 3DM2K3 here, stick to the subject and see the gaping holes in your conspiracy theory.
Because inevitably you will go on to claim that nVidia only cheated in 3DMark and there's no evidence to support it outside of that application.

Is XGI cheating or not?
Yes.

You have already quoted FM saying that S3 didn't render the test properly,
Because they can't.
However XGI won't.

Big difference my friend.

Is requesting nVidia to perform 16x AF and they don't do it cheating on their part?
Now what about 8x AF?

There's a difference between those two, approximately the size of the grand canyon.

I noticed you failed to respond to the compiler points
What compiler points might those be?
 

BFG10K

Lifer
Aug 14, 2000
Feeling guilty you just blew months of BFG's defense out of the water with a post that was supposed to bolster his position? :p :D
Except he did no such thing.

Just to get it to work? hmm seems like a lot of new games come out and ATI is releasing hotfixes for them :)
As opposed to waiting for nVidia's next driver release before seeing a fix for them?
 

BenSkywalker

Diamond Member
Oct 9, 1999
BFG-

Didn't you just state that they had driver issues that were cured by application detection? So what about the next batch of games that hadn't been detected by the drivers yet? And the next ones? And the ones after that? Or do think Quake III and UT detections automatically help Halo and Splinter Cell?

Yes, I never said their drivers were class leading.

Catching the string isn't a problem. Detecting 3DMark in order to do it is. Why go to so much effort to detect 3DMark (splash screen. string, shaders to name a few) if your optimizations don't rely on it and only need shader strings to work?

I'm waiting for any evidence of this in current practice.

Yes he did. If you can't read and/or comprehend that's not my problem.

If that is your interpretation then you have to admit that ATi is cheating in applications and hiding it. He stated they would re-enable optimizations; that is the extent of it.

And there's a good reason too. Don't try to hide the issue by posting Captain Obvious comments. You know darned well why the practice was started and is now widespread.

Two sites showed screwed up bench results so that means that someone was obviously cheating to you. Do you have anything resembling evidence?

Because inevitably you will go on to claim that nVidia only cheated in 3DMark and there's no evidence to support it outside of that application.

What do you mean inevitably? I've told you numerous times, I don't say people are guilty without some evidence.

Because they can't.
However XGI won't.

Are you certain S3 can't?

What compiler points might those be?

The fact that what FM did falls completely in line with breaking compiler optimizations but doesn't jibe with a bunch of different hard-coded shader replacements. I'm waiting for you to fly off the handle about ATi cheating also, now that according to your standard we have an admission that they cheat in numerous games and hide it.

Except he did no such thing.

Then start going with the rants on ATi cheating. It's either that or admit reality and state that application detection and optimizations are not equal to cheating.

Snowman-

the cloaking device in halo is screwed up on nvidia hardware

nV matches refrast for the game, ATi doesn't. This is GearBox's fault, but it is what is happening. I feel like a dirty cheat using the ATi setting. It may be that GearBox assumes that if you run ATi you have no gaming skills and need the help, but obviously that isn't always the case. At least I can use the anti cheat application I have to get around their screw up.
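For anyone unfamiliar with the term, "refrast" is Microsoft's software reference rasterizer from the DirectX SDK, and "matching refrast" just means the hardware's output for a frame is near-identical to what the reference renderer produces. A minimal, hypothetical way to quantify such a comparison from two captured framebuffers looks like the sketch below; the pixel data is obviously invented, and real comparisons would use screenshots of the same scene.

```python
def frame_difference(frame_a, frame_b, tolerance=4):
    """Fraction of pixels whose RGB channels differ by more than `tolerance`.
    Frames are flat lists of (r, g, b) tuples captured from the same scene."""
    assert len(frame_a) == len(frame_b), "frames must be the same size"
    mismatched = sum(
        1 for (ra, ga, ba), (rb, gb, bb) in zip(frame_a, frame_b)
        if max(abs(ra - rb), abs(ga - gb), abs(ba - bb)) > tolerance
    )
    return mismatched / len(frame_a)

reference_frame = [(10, 20, 30), (200, 200, 200), (0, 0, 0)]  # e.g. refrast capture
hardware_frame  = [(10, 21, 30), (90, 90, 90),   (0, 0, 0)]   # e.g. driver capture
print(frame_difference(reference_frame, hardware_frame))  # ~0.33: one pixel clearly differs
```

A small tolerance is used because different rounding in otherwise-correct implementations produces tiny per-channel differences that nobody would call a rendering error.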

as for KotOR, bioware has always been crappy at supporting ati and it doesn't seem like that is going to change no matter what happens.

If ATi takes a commanding lead in the gaming community, they will start to play nicer with them.
 

Pete

Diamond Member
Oct 10, 1999
Now that ATi is in Xbox 2, it has no excuse for poor performance or disabled features in future games. Devs will focus on the ATi h/w inside the Xb2 just like they did with the NV2A in the original Xbox (si hueg lol).
 

BFG10K

Lifer
Aug 14, 2000
Yes, I never said their drivers were class leading.
You asked me what supported my assertion and I answered. What your comment above has to do with said discussion I'll never know.

I'm waiting for any evidence of this in current practice.
What good did it do when you had evidence in the past? Did you change your tune about the issue then? I think not.

This is another one of your tactics: ignore current evidence while it's current and then pretend like there was never any evidence to begin with when a few months pass.

He stated they would reenable optimizations, that is the extent of it.
He stated (among other things) that nVidia will continue to detect 3DMark and work around FutureMark's anti-detection practices. Incidentally, this provides the "current evidence" you asked for above. Of course it's going to make precisely zero difference to your pro-nVidia stance, and in a few months' time I suspect you'll again be asking me for evidence.

As for these valid optimizations you claim nVidia is performing, it's interesting that nVidia requires the constant detection and re-detection of 3DMark in order to continue to use them, despite your insistence that all they look for are generic shader strings. Why on earth does one need to detect a splash screen in order to substitute/shuffle shader instructions in a generic fashion?

Two sites showed screwed up bench results so that means that someone was obviously cheating to you.
What are you talking about? Who screwed what up?

By your own admission more than one site is using custom benchmarks. Please, don't pretend to be so ignorant that you don't know why. You know darned well their reasons for doing it and why it's also become widespread practice.

It follows exactly the same line as nVidia detecting "UT2003.exe" and changing their filtering mode accordingly. Oh that's right, you said that was a bug, wasn't it? Or did you call it a hack? Oh wait, the third time you called it an application profile, didn't you? Or has it now been long enough that you now deem it never existed?
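As a purely illustrative aside (the profile contents and names below are invented, not anything shipped in a real driver), an "application profile" in this sense is little more than a lookup keyed on the executable that overrides the user's settings; whether that reads as a profile, a hack, or a bug mostly comes down to whether it is disclosed and whether the user can turn it off.

```python
from dataclasses import dataclass

@dataclass
class FilterSettings:
    trilinear: bool          # full trilinear vs. a cheaper blend ("brilinear")
    max_anisotropy: int

DEFAULTS = FilterSettings(trilinear=True, max_anisotropy=8)

# Hypothetical per-application overrides, keyed on executable name.
APP_PROFILES = {
    "ut2003.exe": FilterSettings(trilinear=False, max_anisotropy=8),
}

def settings_for(exe_name: str, user_choice: FilterSettings) -> FilterSettings:
    # The sore point: the override wins even when the user explicitly asked
    # for full trilinear, and there is no switch to disable it.
    return APP_PROFILES.get(exe_name.lower(), user_choice)

print(settings_for("UT2003.exe", DEFAULTS))   # trilinear silently forced off
print(settings_for("quake3.exe", DEFAULTS))   # untouched
```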

Are you certain S3 can't?
Ask FutureMark.

The fact that what FM did falls completely in line with breaking compiler optimizations but doesn't jibe with a bunch of different hard coded shader replacement.
nVidia was caught performing multiple shader replacements. Stop trying to change history by pretending it never happened. If these shader replacements are genuine optimizations then nVidia certainly doesn't need to know whether the 3DMark splash screen is present or not.

Then start going with the rants on ATi cheating.
Why?

It's either that or admit reality and state that application detection and optimizations are not equal to cheating.
Show me in Snowman's quote where ATi say that they're detecting Quake III or any other application.
 

kylebisme

Diamond Member
Mar 25, 2000
Originally posted by: BenSkywalker

the cloaking device in halo is screwed up on nvidia hardware

nV matches refrast for the game, ATi doesn't. This is GearBox's fault, but it is what is happening. I feel like a dirty cheat using the ATi setting. It may be that GearBox assumes that if you run ATi you have no gaming skills and need the help, but obviously that isn't always the case. At least I can use the anti cheat application I have to get around their screw up.

huh, where have you seen refrast shots? the cloaking device looks the same on ati's dx9 hardware as it does on the xbox, so i would hardly call that cheating, or a screw up for that matter; it is very clearly intentional. but what is this anti-cheat application you are talking about?
 

BenSkywalker

Diamond Member
Oct 9, 1999
the cloaking device looks the same on ati's dx9 hardware as it does on the xbox

I've owned both an Xbox and Halo for closing in on three years now, you'll have to try something else there. The cloaking device on ATi's hardware doesn't look remotely like it does on the Xbox.

Show me in Snowman's quote where ATi say that they're detecting Quake III or any other application.

App-specific optimizations, they clearly state. Either start bashing ATi for being cheaters, drop it, or be considered in the same class as Hellbinder.
 

kylebisme

Diamond Member
Mar 25, 2000
Originally posted by: BenSkywalker
the cloaking device looks the same on ati's dx9 hardware as it does on the xbox

I've owned both an Xbox and Halo for closing in on three years now, you'll have to try something else there. The cloaking device on ATi's hardware doesn't look remotely like it does on the Xbox.

lol, i got an xbox with halo the christmas it came out as well, so you'll have to forfeit that argument. :D

seriously, something is wrong with your xbox, your drivers or your eyes, because bungie obviously wanted it to look like this or they wouldn't have put the pic on their site, and that is how it looks on both the xbox and ati's dx9 cards as well.


also, again i ask: what is this "anti cheat application I have to get around their screw up" you mention?