
Anand's 9800XT and FX5950 review, part 2


Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Originally posted by: Rollo
You want to know the best way to decide about IQ, regardless of settings? Own both cards and run your own tests, on the same exact machine
I've done that Ronin. I thought my 5800 IQ was great, I think the 9800Pro IQ is a little better. Not enough difference to care about in gameplay.

And you are totally entitled to that choice, and opinion.

Me, I have a mixture of both cards (5900/5800/9800/9700) in my machines. Truthfully, I have a mixture of everything from a Radeon 7200 to a 9800 Pro, and from a GF2 MX400 to a 5900 Ultra. What's in my main box right now? I'm swapping between the 9800XT and the 5950 for testing purposes. :)
 

SilentRunning

Golden Member
Aug 8, 2001
1,493
0
76
Originally posted by: RoninCS
Originally posted by: Rollo
You want to know the best way to decide about IQ, regardless of settings? Own both cards and run your own tests, on the same exact machine
I've done that Ronin. I thought my 5800 IQ was great, I think the 9800Pro IQ is a little better. Not enough difference to care about in gameplay.

And you are totally entitled to that choice, and opinion.

Me, I have a mixture of both cards (5900/5800/9800/9700) in my machines. Truthfully, I have a mixture of everything from a Radeon 7200 to a 9800 Pro, and from a GF2 MX400 to a 5900 Ultra. What's in my main box right now? I'm swapping between the 9800XT and the 5950 for testing purposes. :)

Choice and opinions are good, but they are not the essence of good testing. If one card implements 4XAA and the other card does not, then the card which does not gets its score inflated. This is what bothers me. In the current article all they did was set the cards to the same setting (ignoring the actual output) and run the tests. If you want to test 4XAA, you had better make sure the card you are testing is actually doing it, or the test results are worthless.
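For what it's worth, that kind of check is easy to script. Below is a minimal sketch (the file names are hypothetical and it assumes the Pillow imaging library is installed): diff a no-AA screenshot against a "4xAA" screenshot of the same frame, and if essentially no pixels change, the AA request was almost certainly ignored.

[code]
# Minimal sketch: does a "4xAA" screenshot actually differ from a no-AA one?
# File names are hypothetical placeholders.
from PIL import Image, ImageChops

def aa_actually_applied(shot_no_aa, shot_4x_aa, threshold=0.001):
    """Return True if the '4xAA' shot differs meaningfully from the no-AA shot."""
    a = Image.open(shot_no_aa).convert("RGB")
    b = Image.open(shot_4x_aa).convert("RGB")
    if a.size != b.size:
        raise ValueError("screenshots must be captured at the same resolution")
    diff = ImageChops.difference(a, b)
    # Fraction of pixels that changed at all; an ignored AA setting gives ~0.
    changed = sum(1 for px in diff.getdata() if px != (0, 0, 0))
    return changed / (a.size[0] * a.size[1]) > threshold

if __name__ == "__main__":
    print(aa_actually_applied("frame4000_noaa.png", "frame4000_4xaa.png"))
[/code]

A pixel diff won't tell you whether the AA looks good, only whether the setting changed the output at all, which is the point being made here.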

 

THUGSROOK

Elite Member
Feb 3, 2001
11,847
0
0
Originally posted by: Rollo
You want to know the best way to decide about IQ, regardless of settings? Own both cards and run your own tests, on the same exact machine
I've done that Ronin. I thought my 5800 IQ was great, I think the 9800Pro IQ is a little better. Not enough difference to care about in gameplay.
except when you start to notice that minor things are missing from that gameplay ;)

like rendering distance is shorter than it used to be, and some shadows don't get rendered in certain games, or in certain levels.

if you run the benchmarks comparing >quality vs performance vs high performance< you'll oddly notice that in certain benchmarks there is no high performance level. it benchmarks the same as the performance level.

that tells me there is actually no quality mode!
performance and high performance are actually both high performance, and quality mode is actually performance mode.

run these same benchmarks on some obscure unknown OGL benchmark (X-Isle Tech Demo) and voila! all 3 quality levels exist.
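That observation is easy to sanity-check with a quick script: record the score for each driver preset and flag any pair that comes back effectively identical. A rough sketch follows; the numbers are made-up placeholders, not real results.

[code]
# Rough sketch: flag driver "quality" presets whose benchmark scores collapse
# into the same number, which suggests one of the presets isn't doing anything.
def find_collapsed_presets(scores, tolerance=0.5):
    """Return pairs of presets whose scores are within `tolerance` fps of each other."""
    presets = list(scores.items())
    suspicious = []
    for i in range(len(presets)):
        for j in range(i + 1, len(presets)):
            (name_a, fps_a), (name_b, fps_b) = presets[i], presets[j]
            if abs(fps_a - fps_b) <= tolerance:
                suspicious.append((name_a, name_b))
    return suspicious

# Placeholder numbers for illustration only.
example = {"Quality": 41.2, "Performance": 55.7, "High Performance": 55.7}
print(find_collapsed_presets(example))  # [('Performance', 'High Performance')]
[/code]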

NV AA is for chit. only noticeable difference is in 8x mode, which is unplayable :rolleyes:

NV AF isn't bad, but its setting does not = actual rendering.

>these examples are from personal experience, not from a review.

:D
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Rollo
I'm sick of all you damn child-minded fools who ignore all facts. How about you use your brain and eyes for once, instead of dismissing any little thing that isn't convenient for you, hmm?
LOL- I see I struck a nerve with you. Well, I'm sick of guys like you who just have to be right, have to put other people down over things as meaningless as a video card choice, and have to pimp ATI like your next paycheck depends on it.

What do you mean "just have to be right?" I asked a valid question. You balked, then upon being prodded by oldfart you admitted there was a difference. Way to have your cake and eat it too, Rollo.

This is a really mature, objective thought process you have going on Rollo. Nvidia has been caught cheating over and over again with their 44.xx and up drivers, so let's just sweep everything under the rug, dismiss all possible cheats as fanboyism, shrug off all accusations, and shut off our brains altogether.
Actually it is more mature and objective than yours - I don't assume that nVidia's new drivers will cheat at image quality because others have, that would be illogical.

I didn't assume anything. I looked at the pictures, and checked out discussion at Beyond3D about it.

I really don't care if they do cheat if the rendered image is the same, which Anand says it is. He's seen both side by side full screen, have you Jiffy? Or does it just make you uncomfortable he's contradicting your anti-nVidia flaming? I'm as objective as it gets, I like both nVidia and ATI cards, purchase both, and am glad for the differences.

Anti-Nvidia flaming? This is why I hate these discussions. Where did I fanboyishly flame anti-Nvidia? I said the image quality looks blurry - it does. I said the AA doesn't look right. It still doesn't. You hear whatever you want to hear, Rollo.

Now Anand is recommending we jump off a bridge. You first, Rollo.
I'd rather jump off a bridge than be reduced to a little video card pimp. LOL, the point was that if the guy looking at them full screen, side by side, can't tell the difference in IQ, it must not be big enough to care about.

Go for it big shot.

Originally posted by: Rollo
I'm as objective as it gets, I like both nVidia and ATI cards, purchase both, and am glad for the differences.

That is the funniest thing I've ever heard. Every single post you've made here in Video is about how Nvidia doesn't cheat, has never cheated, blah blah. In your own world, owning a Radeon 9800 gives you a magical shield against being biased against ATI. Just like how you call Gabe Newell fat, say he ate your "sack of sugar", etc., but you're exempted from that because you gained some weight after college.
 

THUGSROOK

Elite Member
Feb 3, 2001
11,847
0
0
just to add some fuel to the fire...

oldfart is gonna like this ;)

just out of curiosity I decided to scrutinize the screenshots posted for AM3.
I'm using the Omega 45.23 Quality drivers (FX5900), and my AM3 @ frame 4000 screenshot looks much different than what AT posted.


AT NV 45.23

AT NV 52.14

AT ATI 3.7

My NV Omega 45.23



obviously I know how to use an image processor in order to crop my screenshot exactly the same as AT did, but I did not make any other changes to the pic.

this post is in no way saying AT did something wrong. I'm only trying to point out that there is definitely something weird with the NV drivers if my Omega drivers look so much different.

:D
 

THUGSROOK

Elite Member
Feb 3, 2001
11,847
0
0
Originally posted by: shady06
maybe it's the wacky tobaccy but I don't see this difference in the AQ3 screenies

take a closer look at the background in my pic.
actually my pic is the only pic that has a background ;)
the rest are all dark and basically black.
 

THUGSROOK

Elite Member
Feb 3, 2001
11,847
0
0
hey :|
I just realized that AT used very low-quality jpgs for those shots :Q
AT's jpgs are only 39kb, and mine is 63kb using a typical 5% compression.
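For anyone who wants to see how much the JPEG setting matters, here is a quick illustration (the file name is hypothetical and it assumes the Pillow library): re-save the same screenshot at two quality settings and compare the resulting file sizes. Note that a "5% compression" slider in an image editor doesn't map directly onto Pillow's 1-100 quality scale; the point is just that the quality setting drives both file size and how mushy the image gets.

[code]
# Quick illustration: the same screenshot saved at two JPEG quality settings.
# "am3_frame4000.png" is a hypothetical placeholder file name.
import os
from PIL import Image

shot = Image.open("am3_frame4000.png").convert("RGB")
for quality in (95, 60):
    out = f"am3_frame4000_q{quality}.jpg"
    shot.save(out, "JPEG", quality=quality)
    print(f"quality={quality}: {os.path.getsize(out) / 1024:.0f} KB")
[/code]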

shame on you AT :p


(see, I'm an equal opportunity flamer) ;)
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
oldfart-

On Driver size:
nVidia:
English: File Size: 8.5 MB
International: File Size: 18.9 MB

ATi:
25.3 Meg.
Supports 24 languages. The driver size "exploded" when they went to an international version instead of separate downloads.

The "super cheating, exploded file size" 52.13s are only 5MB, a reduction of 41% versus the prior version. Doesn't really jibe with the rhetoric, does it?

Ben, since you were good enough to mention TRoD bugs in Anand's review, what's going on with the bugs mentioned in F1 Challenge, GunMetal, Homeworld 2, and X2??

I thought nVidia didn't have driver bugs??

Because those are all cheats, but we already know that nVidia cheats in everything. They even cheat booting in to Windows, just ask the people in these forums. You can't really even play any games with nVidia boards at all, all they do is show cheats.... :)

Seriously though, any image glitch in a game from nV is now considered a cheat instead of the possibility of it being a bug. nV's drivers certainly have some bugs; I haven't run into ones like this though. It has gotten quite absurd. nVidia needed compilers that were going to take a decent amount of time to complete before their parts would show close to their potential. This has been known for some time. nV gets caught with cheats in 3DM2K3 and all of a sudden every glitch in every game is a cheat to improve performance. It comes across as comical how many times these glitches have been fixed without a performance hit, but it is sad to see some people so thick-skulled as to completely ignore how they were wrong and still think that they are correct about every other game just because nV got nailed cheating once. ATi was also nailed cheating in the same bench, just not nearly as badly. If anyone can honestly make the automatic assumption that nVidia is cheating every time they have an image glitch, then they must also assume the same for ATi.

The line about nV replacing shader code in their drivers is true, but if you consider how they are doing it, then ATi is doing the exact same thing. We have heard numerous times about the huge disparity in performance in HL2 (while people ignore what was formerly the most hyped game engine on these boards, that is, until they found out ATi wasn't up to the performance level of nV), but we now have Valve admitting that they were not making an effort to make it close to fair. Actually, by compiling the mixed mode path for nV and not the DX9 path, they went out of their way to make nVidia's parts look considerably worse than any honest assessment could make them. This rigged-up test is continually being used as an example of how much faster ATi's parts are than nV's in shader-intensive benches.

Halo also used to be lumped in with HL2. What happens when the game is released and people see the actual performance? All of a sudden Halo is less of a game to many of the ATi faithful. In that same vein, there are users on this board who have been avid id fans for many years and haven't even played through HL that are talking sh!t about Doom3 and building HL2 up to be the greatest thing since sliced bread. I've said it before and I'll say it again: I have every faith that HL2 is going to be the better game, and I'm also quite confident at this point that the Doom3 engine is going to be better. I know your stance on HL2, but that is in no way a departure from what you have felt for many years now, having nothing at all to do with how it performs on a given card. Too many people on this forum are figuring out what games they like and don't like based on how fast they run on given hardware. To me, that is the best/saddest example of rabid fanboi syndrome I've seen yet.

BTW - on the AA/AF shots you posted, something is certainly off: ATi is showing quite a bit of AA on the textures in that screenshot. Very interesting, as they don't have an SSAA mode supported, wouldn't you say?
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Thugsrook:
NV AA is for chit. only noticable difference is in 8x mode, which is unplayable
I have turned 4X off and on with an nV30; gee, there was a difference. If you can't see the difference with your 5900, I guess there's no point in discussing it. I would be interested to see links to review sites that state "nVidia AA does nothing", however; as you can't see the difference, I'm sure this has been widely reported.

BTW - I get some flashing cave walls in the "Magma" level of UT2003 with my 9800Pro. Oh no - could it be driver cheats? :rolleyes:
I agree with Ben: every time an ATIdiot sees an abnormality with a game, it's on to the bbs, "Yapyapyap! Cheating! Evil nVidia!"

Jiffy:
In your own world, owning a Radeon 9800 gives you a magical shield against being biased against ATI
You're obtuse. You confuse "not being biased against nVidia" with "being biased against ATI". I'd defend ATI too if people were making ridiculous claims about them - let's say, being stupid enough to claim they suck because they don't perform as well on one future API that doesn't matter at all these days.
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Ben, the driver file size rhetoric is yours. Kind of grasping a bit, aren't you??
Because those are all cheats, but we already know that nVidia cheats in everything
I called them bugs. You call them cheats. Whatever. The point is, nVidia is not perfect. You are good at pointing out every ATi bug in excruciating detail, while totally ignoring or justifying any nVidia flaw. My point had nothing to do with cheating drivers. Where is the word "cheat" in my post? Just wanted to point out that nVidia has bugs like anyone else. It just depends on whether you choose to ignore them or not.

Your comment about the AA is a perfect example. Somehow, nVidia shots with 4xAA looking the same as no AA, while ATi's 4xAA looks smooth, is an ATi problem?

 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Ben drivers file size rhetoric is yours. Kind of grasping a bit aren't you??

That would be a good point, except shader replacement code takes up file space; I'm not the one running on and on about that like a fool. I did make the leap from "replacement code is built into the drivers for every benched game" to "that means larger driver file sizes", but I don't consider that too much of a leap for people to follow.

I called them bugs. You call them cheats. Whatever.

Emoticon is there and everything......

You are good at pointing out every ATi bug in excruciating detail

If you would like me to start exhibiting that trait, I can. I didn't comment much on the Quack issue, didn't comment much on the 3DM2K3 issue, and I'm still limiting my comments on this. The reason I kept my mouth shut on Quack is that those pushing the rhetoric the hardest ended up looking like fools. The archives go back to the Quack days; anyone can check the validity of my claims. That didn't hard lock systems or make them reboot, so it wasn't nearly as big of an issue as those bugs that I do like to point out, and I will continue to make a big deal of them if they affect me.

Just wanted to point out nVidia has bugs like anyone else. It just depends if you choose to ignore them or not.

And my major issue has always been with bugs that hard lock systems or remain in place for a year without a fix, things like that.

Somehow, nVidia shots with 4 x AA looking the same as no AA, when ATi's 4 x AA looks smooth is an ATi problem?

I pointed out that the screenshot comparison in terms of AA is invalid, and very clearly so. The ATi board is displaying something it can't do; obviously something is wrong with the screenshots. As far as the nV shot having AA - the picture they have displayed shows some AA (although I can't rule out that that is due to the screenshots being screwed up); it simply does not work as well as ATi's, nor has it ever (in terms of edge AA, a point I have not argued).

Edit - BTW, in case you misread my last post, the majority of it wasn't directed at you. If you read the behaviour description, I think it's fairly obvious you haven't been displaying the traits I was ranting about (I even went out of my way to point that out about HL2).
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Ben, in this post and others (the level of compression in AQ3) you seem to be implying that Anandtech has doctored the screenshots, or am I reading you wrong? If so, that is a fairly serious accusation. If it is true, I would like the reviewers to answer it. If I'm reading you wrong, my apologies. Please explain how you think the screenshots got screwed up, if not intentionally.

You will go on about that Half-Life bug till eternity. :)

It's also interesting how you have pointed out all the rendering imperfections on ATi cards through the years (most of which no one else complained about) and have nothing to say about nVidia not rendering fog, lighting, or shadows in games. If you are going to criticize image quality, please do so objectively instead of doing it one-sided. How many years will you put nVidia in the penalty box for these errors? Not as many as ATi, I'm sure.

As far as the cheating cr@p goes, I think any mfgr should be ashamed to have done it. This includes ATi (Quack) or the current nVidia situation, which seems to be improving with the 52.xx series drivers.

All of this has actually been good in a way. It has forced the review sites to focus on IQ instead of just running a benchmark and posting the results with not even a mention of IQ. People are much more cognizant of image quality as well as FPS now. It has also resulted in a broader suite of game tests instead of the "Top 3" well known benches that can have specific optimizations.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Ben, In this post and others (the level of compression in AQ3) you seem to be implying that Anandtech has doctored the screenshots, or am I reading you wrong?

Doctored no, screwed up yes. Using a different level of compression on JPEGs can seriously impact the quality of the image, and for the screenshots you posted, something is screwed up. ATi does not support SSAA, but that is what that screenshot is showing in terms of the textures. Possible ways this could have happened? If the image was accidentally resized (scaled up) and then fixed, it would have the same effect. I would certainly be interested in hearing comments from the reviewers, but they don't answer me (Evan is the only one I can think of that ever has). They have made numerous screw-ups in the past, and because of their ability to screw up for and against the different companies in the same review, I don't see it as doctoring of anything, just mistakes.

The best way to handle screenshots, if you have the bandwidth, is to show the JPEGs and then allow users to download the original bmps, or at least pngs, so they can better see for themselves the exact comparison. ATi and nV are quite different in their gamma settings; it is very easy to try your @ss off to show users what the end image looked like and screw it up in the process. There are a slew of ways that you can hose screenshots with the best of intentions. In reality, if you simply take screenshots and post them exactly how they are, you will mislead people considerably in numerous games (you still have a nV board, right? Take a Q3 screenshot using the in-game method, F11 IIRC, and one with PrintScreen in the exact same spot without moving, back to back, and compare them - huge difference; the in-game method gives you what you see on screen, while the PrintScreen one you have trouble making out).
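As an aside, the "accidentally resized then fixed" scenario is simple to reproduce. A minimal sketch (hypothetical file name, assumes the Pillow library): scale a screenshot up and back down, which runs it through interpolation twice, smooths edges, and can mimic extra anti-aliasing in a comparison shot.

[code]
# Sketch of the resize round trip described above; "screenshot.png" is a placeholder.
from PIL import Image, ImageChops

original = Image.open("screenshot.png").convert("RGB")
w, h = original.size

# Scale up, then back down to the original dimensions (default resampling filter).
round_tripped = original.resize((w * 2, h * 2)).resize((w, h))

# Count how many pixels the round trip altered.
diff = ImageChops.difference(original, round_tripped)
changed = sum(1 for px in diff.getdata() if px != (0, 0, 0))
print("pixels altered by the round trip:", changed)

# Saving losslessly at least avoids stacking JPEG artifacts on top of the blur.
round_tripped.save("screenshot_roundtrip.png")
[/code]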

You will go on about that Half-Life bug till eternity. :)

Hehe, d@mn straight ;)

Seriously, the thing that bothered me the most about that situation was that there was no one saying that any of these bugs existed. I read post after post about how ATi's drivers were incredible, better than nVidia's. I spent a couple hundred and found out that they were way the hell off the mark, and even then numerous people attempted to deny the bugs existed after ATi said they did. Many of these same people are the ones jumping on anything nV does with a cheater label.

Its also interesting how you also have pointed out all the rendering imperfections on ATi cards though the years (most of which no one else complained about)

I have done that a couple times, mainly when I hear about how 'perfect' ATi's drivers are.

and have nothing to say about nVidia not rendering fog, lighting, shadows in games.

Can someone point me to them? I have a rather extensive collection of games here, and I end up being told that they don't show up in any of the titles I have. Combine this with the raving about ATi's drivers and the less than objective assessment of nV by these same people, and you get the picture ;)

I was planning on picking up an FX5900 until I heard about all the driver issues. Because of them I haven't (which I have stated previously, although I don't recall in which thread). In the long run I think it's better, as now the big 2 have been delayed and it looks like NV40/R390 will be here before then, and every game I'm playing is going quite nicely (well, I could stand a higher res in Halo). Driver issues or potential driver issues are a big deal to me; I don't try to deny that nV has problems with their drivers, and that is a big difference between myself and others. I do ask for examples of them from time to time, and I've even looked for some of the games they show up in for the sole purpose of seeing if I run into them (in the bargain bin - not going to waste $50 to see a bug :p IGI was one of them IIRC).

If you are going to criticize image quality, please do so objectively instead of doing it one sided.

I have stated numerous times that ATi has clearly superior edge AA to nVidia; what else am I supposed to comment on? nV has superior texture filtering, ATi has superior AA. When you calibrate them, they both have extremely close color calibration. ATi is a bit more aggressive with their LOD (although that is adjustable).

In debate terms, first you have to have something to debate about. You can talk about the bugs in nVidia's drivers so we can talk about the finer points on the IQ level, but how am I supposed to so much as discuss IQ with people who say any glitch is a clear cheat? You can't have a reasonable discussion if the participants aren't able to agree on basics.

How many years will you put nVidia in the penalty box for these errors? Not as many as ATi I'm sure.

If they have bugs that hard lock my system or prove to be otherwise defective, then the penalty is five years (dead serious - by that I mean I won't spend another cent on their products for myself if they will see any of it). Creative Labs is in my penalty box now, as are Abit, Sony, RCA, Philips, Eidos and Bethesda, off the top of my head.

Edit-

D@mnit, you edited your post while I was typing mine :)

As far as the cheating cr@p goes, I think any mfgr should be ashamed to have done it. This includes ATi (Quack) or the current nVidia situation which seems to be improving withe the 52.xx series drivers.

I agree. For nV, I think the overwhelming majority of 'cheats' were bugs. Compilers are significantly more complex than typical drivers.

All of this has actually been good in a way. It has forced the review sites to focus on IQ instead of just running a benchmark and posting the results with not even a mention of IQ. People are much more cognizant of image quality as well as FPS now. It has also resulted in a broader suite of game tests instead of the "Top 3" well known benches that can have specific optimizations.

Absolutely agree. I think that review sites need to improve their methodology for IQ comparisons considerably, although it is obviously much better than it used to be. I am also very pleased to see expanded benches being used, even if I take some issue with how they do it sometimes (minor quibble: why did they use the light cycle section of Tron for a bench? the game is an FPS, and the LC is a mini game). If each site picks fifteen to twenty games, and they only have five of them shared between them (you obviously still need the big 3), then everyone will be forced to optimize their general drivers more so than for a particular set of games.
 

THUGSROOK

Elite Member
Feb 3, 2001
11,847
0
0
can't you guys talk about this without starting a war?

despite all the flaming going on in this thread, I've learned a lot from the AT review and from some of the comments made by some of the ppl in this thread.

:D
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
I said the majority.
The majority are cheats.

Compiler bugs don't cause reduced/ignored AA/AF requests.
Compiler bugs don't cause static clip planes that magically pop up in certain places for certain benchmark runs.
Compiler bugs don't cause Z clearing to be ignored during certain benchmark runs.
Compiler bugs don't magically substitute entire games' shaders with their own versions.
Compiler bugs don't cause screenshots to look better than what is actually on the screen.

It's not much different from Quack.
Actually it is different, as there's compelling evidence to call Quack a genuine bug, both from ATi developers and ATi beta testers. Are you telling me that any of the above nVidia problems can be justified as driver bugs?

I have seen nothing that backs this up.
Then perhaps you should read Gabe's presentation.

You think Valve is a good example of an evenhanded company?
Why would Valve make something like that up? What possible benefit would they get out of publicly antagonising nVidia like that? If anything they're cutting away their own sales as nVidia card owners might avoid the game.

Also, no offense, but I'd be more inclined to believe what an actual developer like Gabe says than what you say. You certainly know your stuff but you also have an incredible nVidia bias, to the point where they absolutely can't do anything wrong. nVidia's cards could explode and cause fiery deaths and you'd still find some way to justify it.

With all Valve's talk about how much extra time they had to spend on coming up with a custom nV code path they couldn't even bother to compile the default shaders with MS's compiler that would improve performance.
My understanding was that Valve used nVidia's CG compiler since the FX series ran even more poorly with MS's default compiler. Also correct me if I'm wrong but there's supposed to be another version of MS's compiler coming out to work around the FX's limitations a bit better.

Also, in his very presentation Gabe asked disbelievers to ask Microsoft whether Valve cooked the numbers and/or failed to optimise correctly for nVidia hardware, so that suggests to me that Valve has been working with Microsoft on the issue. Or are you now going to claim that Microsoft doesn't know how to optimise DirectX code either?

They were in 3DMark2K3.
Where? Show me your evidence.

The same guy who implies he's bending over backwards spending a significant amount of time to optimize for nVidia and can't even bother to use MS's compiler.
Again, my understanding as to why they didn't use MS's compiler was because it was even slower than nVidia's CG compiler.

The Dets have been reduced in size
Of course they have. It's a result of the compression they use to stop anti-cheat programs from working. Another "compiler bug", eh?
:rolleyes:


As far as automatic shader replacement goes, every one of the vendors does this and they have to.
I'm not talking about conversion, I'm talking about what is fed to the compiler to begin with.

The 3DMark03 issue proved, among other things, that nVidia had completely substituted one of their own custom-made water shaders simply to raise 3DMark performance. The shader wasn't optimised, it was replaced with a version that ran much faster but didn't exactly look like the original. "Close enough", and nVidia hoped nobody would notice, but unfortunately for them almost everyone did notice.

If you are creating your own hard-coded shaders for a plethora of the ones used in titles, then you might see something like 20MB driver files. That would be an obvious example of cheating, right?
If you can find some way of decompressing nVidia's drivers then you'll be able to see their actual size, not the worthless initial size that's reported when you first download it.
 

THUGSROOK

Elite Member
Feb 3, 2001
11,847
0
0
Originally posted by: THUGSROOK
I wonder,
did the NV drivers have "texture sharpening" enabled during those AM3 screenshots?
it's been confirmed ~
"texture sharpening" enables 2xAF for D3D applications.
it also overrides Aquamark3's call for 4xAF, and uses 2xAF instead.

:rolleyes:
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
I thought texture sharpening just boosts the driver-forced AF level one notch. Thus, if you had AF set to "off/app. pref." in nV's drivers, but then enabled "texture sharpening," the end result is that you're enabling 2xAF via drivers, overriding the game's settings. This isn't exactly sneaky, but it is a horrible GUI design decision, IMO (akin to ATi now burying its AA and AF controls under another GUI layer in favor of the practically useless profile settings).
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Compiler bugs don't cause reduced/ignored AA/AF requests.

Try to enable AA on your R9700Pro running Halo. Is that a cheat? There are cases where AA may not work correctly due to driver bugs or game issues; why you think they are cheats is beyond me. As far as AF goes, I've never seen an ATi card with any driver that did it properly - why don't you point one out? If you take issue with optimizations, then you need to do so in an honest fashion and either state that ATi does not and never has supported AF, or take the position that you don't like particular optimizations.

Compiler bugs don't magically substitute entire games' shaders with their own versions.

That is exactly what driver-level compilers do. ATi's does, as does nVidia's. The DX9 assembly-level code is not machine-level code; the driver-level compiler must convert it over.

Compiler bugs don't cause screenshots to look better than what is actually on the screen.

I don't believe Gabe in any way shape or form on that. Not even close.

Are you telling me that any of the above nVidia problems can be justifed as driver bugs?

The static clip planes I've already said were a cheat; the rest, absolutely. You realize that most of what you've been ranting about, nV cheats and changed shader code, has vanished with the latest drivers, and they are faster than before? How do you justify your implication that they were all cheats? If the IQ is back and the performance is still higher, it's a safe bet that they were bugs, as had been stated numerous times. That you can still attempt to go on about it is puzzling; how many times must your line of thought be proven wrong?

Why would Valve make something like that up? What possible benefit would they get out of publicly antagonising nVidia like that? If anything they're cutting away their own sales as nVidia card owners might avoid the game.

The quotes I pulled were from Valve. They rigged the test and admitted to it (although not in so many words). Why would they do something like that? It couldn't possibly have anything to do with several million dollars in above-the-board promotional money, I'm sure. It is public information that ATi paid them millions, but of course that was saintly cash....

My understanding was that Valve used nVidia's CG compiler since the FX series ran even more poorly with MS's default compiler. Also correct me if I'm wrong but there's supposed to be another version of MS's compiler coming out to work around the FX's limitations a bit better.

Check out the link I provided (it's on B3D's front page). The nV-optimized compiler from Microsoft was available to Valve, and they used it only for mixed mode. For DX9 mode they used the compiler that was optimized for ATi only, despite obviously having the nV version, from Microsoft, available to them. The quote I pulled is from Valve; they admitted to it (with the source code leak, it wouldn't take too much effort for someone to simply compile it the optimal way and show the improvement).

Also in his very presentation Gabe asked disbelievers to ask Microsoft whether Valve cooked the numbers and/or failed to optimise correctly for nVidia hardware so that suggests to me that Valve have been working with Microsoft on the issue. Or are you now going to claim that Microsoft doesn't know how to optimise DirectX code either?

Valve has admitted what they did. With the source code out, they couldn't be expected to hide it any longer.

Where? Show me your evidence.

Use the 2.7 Cats and run the Nature test; pay close attention to the leaves. Rerun the bench with newer drivers. This is a bit unfair - you have to rely on third-hand speculation from people with an agenda, while I got to test first hand.

Of course they have. It's a result of the compression they use to stop anti-cheat programs from working. Another "compiler bug", eh?

The driver size is reduced vs. the earlier versions that had the anti-optimizations. According to your 'second gunman on the grassy knoll' theory, this wouldn't be viable, what with nVidia recoding most of the games on the market.

The 3DMark03 issue proved among other things that nVidia had completely substituted one of their own custom-made water shaders to simply raise 3DMark performance. The shader wasn't optimised, it was replaced with a version that ran much faster but didn't exactly look like the original. "Close enough" and nVidia hoped nobody would notice but unfortunately for them almost everyone did notice.

And ATi replaced the leaf shader. They hoped their loyalists wouldn't notice or would remain quiet, and for the most part they did.

If you can find some way of decompressing nVidia's drivers then you'll be able to see their actual size, not the worthless initial size that's reported when you first download it

Do you realize how you sound? To you, nVidia is in the midst of some giant conspiracy. Apply Occam's razor to your logic and see what you think.
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
The quotes I pulled were from Valve. They rigged the test and admitted to it(although not in so many words)

Valve has admitted what they did. With the source code out, they couldn't be expected to hide it any longer

Link?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Were the Shaders in the benchmark compiled with the latest version of HLSL (i.e. the version that orders the assembly for a more efficient use of the FX's register limitations)?

[Brian Jacobson] The ones used by the mixed mode path were.

Link. They had the compilers to use for mixed mode; they could have used them for DX9 if they wanted to present the fairest comparison they could have, but they didn't. I'm not talking about rewriting code, reducing precision or anything like that; I'm talking about simply compiling the code using MS's own compiler for both architectures instead of only one. With all the crap Newell was saying about all the extra work they went through for nV, they couldn't even bother to compile the code they already had with the compiler they already had? Using simple logic, what are the odds that Valve didn't compile the code for the best performance and see the results beforehand, then decide what to implement for the public benches? They spent months recoding for nV and couldn't bother to compile code they had already completed? The logic is missing there, to say the least :)