Forceware 53.03 gives FM the finger


Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: Is
Originally posted by: SilverLock

Err....

Would you get rid of 3Dnow? SSE? SSE2?


I would get rid of that stuff, if it came at the expense of standard FPU performance, which is analogous to what's going on.

No you wouldn't.

 

Pete

Diamond Member
Oct 10, 1999
This whole thread is chock full of (willfully or not) "misinformed mumbling." Do we need another thread debating this, particularly one so un-/mis-informed?
 

Jeff7181

Lifer
Aug 21, 2002
Originally posted by: Pete
This whole thread is chock full of (willfully or not) "misinformed mumbling." Do we need another thread debating this, particularly one so un-/mis-informed?

Set everyone straight then, oh great one.
 

ZimZum

Golden Member
Aug 2, 2001
Originally posted by: Insomniak
Originally posted by: SilverLock
Originally posted by: McArra
I'm against optimizations, as neither ATI nor Nvidia should use them to score higher in 3DMark. If Nvidia thinks 3DMark is irrelevant, they should stop optimizing for it. The thing is, it's not only 3DMark that shows poor PS 2.0 performance; Shadermark 2.0, Tomb Raider, HL2... also show it. Not to mention that developers are putting in long hours so Nvidia's new hardware can play their games smoothly. Even Doom 3 has had special development for Nvidia and has cut FP to 16-bit precision.

My opinion is that Nvidia is having a bad time: they've tried to make some kind of "Glide" and have failed to make the hardware perform as it should with standard DX9 code. I used to like Nvidia cards a lot (I love my NForce2 400U mobo), but they have made a misstep, which I'm sure they're going to correct in the next-gen cards. Now ATI has the lead, with great performance and much improved drivers.

Err....

Would you get rid of 3Dnow? SSE? SSE2?


Exactly what people don't seem to realize about this whole situation. Thanks for pointing that out.

The difference is that, unlike the aforementioned optimizations, nVidia's only offer performance increases in benchmarks, not in actual games. That's one of FM's problems with what nVidia is doing. According to them, if it only increases performance in the benchmark and not in real-world situations, it's not valid.

 

Jeff7181

Lifer
Aug 21, 2002
Originally posted by: ZimZum
Originally posted by: Insomniak
Originally posted by: SilverLock
Originally posted by: McArra
I'm against optimizations, as neither ATI nor Nvidia should use them to score higher in 3DMark. If Nvidia thinks 3DMark is irrelevant, they should stop optimizing for it. The thing is, it's not only 3DMark that shows poor PS 2.0 performance; Shadermark 2.0, Tomb Raider, HL2... also show it. Not to mention that developers are putting in long hours so Nvidia's new hardware can play their games smoothly. Even Doom 3 has had special development for Nvidia and has cut FP to 16-bit precision.

My opinion is that Nvidia is having a bad time: they've tried to make some kind of "Glide" and have failed to make the hardware perform as it should with standard DX9 code. I used to like Nvidia cards a lot (I love my NForce2 400U mobo), but they have made a misstep, which I'm sure they're going to correct in the next-gen cards. Now ATI has the lead, with great performance and much improved drivers.

Err....

Would you get rid of 3Dnow? SSE? SSE2?


Exactly what people don't seem to realize about this whole situation. Thanks for pointing that out.

The difference is that, unlike the aforementioned optimizations, nVidia's only offer performance increases in benchmarks, not in actual games. That's one of FM's problems with what nVidia is doing. According to them, if it only increases performance in the benchmark and not in real-world situations, it's not valid.

But it does... ask anybody with a GeForce FX... the 52.16 drivers provide better performance in every game than the 45.xx drivers.
 

Genx87

Lifer
Apr 8, 2002
I'm against optimizations, as neither ATI nor Nvidia should use them to score higher in 3DMark. If Nvidia thinks 3DMark is irrelevant, they should stop optimizing for it. The thing is, it's not only 3DMark that shows poor PS 2.0 performance; Shadermark 2.0, Tomb Raider, HL2... also show it.

Ya, well, HL2 isn't anywhere near done, and is Tomb Raider even out yet? You can't really judge performance on anything that's still beta. Remember the Athlon prototypes Tom reviewed in the spring of '99? They showed FPU performance worse than a Pentium 1.

Not to mention that developers are putting in long hours so Nvidia's new hardware can play their games smoothly. Even Doom 3 has had special development for Nvidia and has cut FP to 16-bit precision.

Well, from what I gather, Nvidia went to John Carmack and asked him what he would like in a GPU, and they followed it pretty closely. That is why John uses the special extensions and FP16: not because he has to in order to get Doom 3 to work on an FX, but because the card has almost everything he wanted for the engine he was writing. Nvidia has even admitted to building the FX to run Doom 3. If ATI had the same extensions, I am sure he would use them. I gathered this off the Beyond3D forums.
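
On the FP16 vs. FP32 point, here's roughly what gets traded away (a quick NumPy sketch with half precision standing in for the FX's partial-precision shader math; this is illustration only, not actual GPU output):

```python
# Half precision (FP16) vs. single precision (FP32), using NumPy as a stand-in.
import numpy as np

x = np.float32(1.0) / np.float32(3.0)
print(np.float32(x))   # ~0.33333334  (FP32: 24-bit significand)
print(np.float16(x))   # ~0.3333      (FP16: 11-bit significand)

# FP16 carries roughly 3 decimal digits and tops out at 65504, which is
# usually fine for color math but can show banding in long shader chains.
print(np.finfo(np.float16).precision, np.finfo(np.float16).max)
```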

My opinion is that Nvidia is having a bad time: they've tried to make some kind of "Glide" and have failed to make the hardware perform as it should with standard DX9 code. I used to like Nvidia cards a lot (I love my NForce2 400U mobo), but they have made a misstep, which I'm sure they're going to correct in the next-gen cards. Now ATI has the lead, with great performance and much improved drivers.

If you are referring to Cg, Cg can output code compatible with standard DX9, from my understanding. So it isn't like Glide, where only 3dfx cards could run it. You should be able to use Cg and have it work on an ATI card.

According to them, if it only increases performance in the benchmark and not in real-world situations, it's not valid.

Well, in some games we saw a 70% increase in performance, with improved IQ, going from the 45.xx to the 52.xx drivers.
 

Matthias99

Diamond Member
Oct 7, 2003
The point is that Futuremark has asked the GPU manufacturers not to use application detection in 3DMark03. This is their call, and NVIDIA should not be trying to sidestep their restrictions. NVIDIA is refusing to go along because they believe this is unrealistic (although ATI seems to have no problem with this!) Whether it is a good idea or not is an entirely separate issue from whether or not NVIDIA should follow the rules that have been established.

First NVIDIA claimed that 3DMark03 was a bad benchmark because it didn't use realistic enough code (and then got caught "cheating" at it after disparaging it as inaccurate and meaningless). Now their problem is that FM won't let them hand-optimize for it (even in ways that could be used for real programs) -- although since a few real DX9.0 games have come out, I notice they've stopped trashing its numbers as unrealistic.

ATI put out a pretty amusing statement a few weeks back about the percentage of games that actually get hand-optimized shaders and the like written for them by the driver team. Basically, only the most popular games and benchmarks will get worked on by NVIDIA and ATI. FM wants 3DMark03 to be representative of the raw hardware/baseline driver performance of the cards, not of how much time the driver team spent working on custom shaders for it. If you don't think that's a good idea, or that it's representative of "real" performance from the cards, then just disregard all 3DMark03 results.

But it does... ask anybody with a GeForce FX... the 52.16 drivers provide better performance in every game than the 45.xx drivers.

Then shouldn't they provide better results in 3DMark03, even without app-specific optimizations? If not, this means all NVIDIA has been doing for the last 6 months is application-specific enhancements rather than making their drivers faster in general.
 

Jeff7181

Lifer
Aug 21, 2002
Originally posted by: Matthias99
But it does... ask anybody with a GeForce FX... the 52.16 drivers provide better performance in every game than the 45.xx drivers.

Then shouldn't they provide better results in 3DMark03, even without app-specific optimizations? If not, this means all NVIDIA has been doing for the last 6 months is application-specific enhancements rather than making their drivers faster in general.

Yes... it SHOULD... but FM has a bug in their ass for some reason. I got better performance in every game I own by switching from 45.xx to 52.xx... but 3DMark2003 scores don't reflect that when FM creates a patch to disable nVidia's "application specific optimizations."
 

Blastman

Golden Member
Oct 21, 1999
I'll take a shot at this.

The 52.16 drivers do contain some genuine performance improvements because of the unified compiler technology -- and I think 3DMark03 reflects this to a certain degree. But Nvidia is also hand-coding replacement shaders for 3DMark03 and other games to improve performance. This hand-coding is not a realistic strategy for increasing performance beyond a few games because it is so labor intensive.

All Futuremark did was reorder the shaders slightly. ATI has a "unified compiler" technology too, and it was able to cope with the reordered instructions, so ATI's scores remained the same. NV's scores dropped because their "compiler" wasn't just coping with a simple reordering of instructions; they were replacing shaders outright, and a simple reordering broke that optimization.

This hand-coded shader replacement can also present a problem if a game is updated or new levels are released -- the hand-coded shaders may not work anymore, and you lose the performance gains they provided.
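
To make that concrete, here's a toy sketch of the mechanism (Python, everything made up for illustration -- the shader text, the MD5 fingerprinting, the lookup table; it is not how NVIDIA's driver actually works) showing why a replacement keyed on the exact shader falls over when the instructions are merely reordered:

```python
# Toy model of fingerprint-based shader replacement.
import hashlib

def fingerprint(shader_asm: str) -> str:
    """Identify a shader by hashing its exact instruction text."""
    return hashlib.md5(shader_asm.encode()).hexdigest()

# Hand-tuned replacements, keyed on the fingerprint of the shader the
# driver expects the application to submit.
HAND_TUNED = {}

original = "mul r0, v0, c0\nadd r1, v1, c1\nmad r2, r0, r1, c2"
HAND_TUNED[fingerprint(original)] = "<faster hand-written version>"

def compile_shader(shader_asm: str) -> str:
    """Pretend driver front end: swap in a hand-tuned shader if recognized,
    otherwise fall back to the generic compiler."""
    return HAND_TUNED.get(fingerprint(shader_asm),
                          "<generic compile of submitted shader>")

# The unpatched benchmark submits the expected shader: the swap fires.
print(compile_shader(original))

# The patch swaps two independent instructions.  Same math, different text,
# so the fingerprint misses and the score falls back to the generic path.
reordered = "add r1, v1, c1\nmul r0, v0, c0\nmad r2, r0, r1, c2"
print(compile_shader(reordered))
```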
From beyond3d:

A new PC game gets released about once a day; about one-third of these are games that really push 3D graphics. Only a tiny percentage of these will receive the dubious "optimizations" that have been directed at previous versions of 3DMark03. Gamers don't want to be locked into these. They don't want to be surprised by poor frame rates when they buy a game outside the top 10. Gamers need uncompromised benchmarks that give them a true picture of performance, so they can find a card that performs for all games, not just the half-dozen for which the drivers are faking it.

 

Jeff7181

Lifer
Aug 21, 2002
In the same respect, Homeworld 2 players will be disappointed when their 9800 Pro doesn't perform the way it does in 3DMark2003.

Solution to the problem... Industry Standard Benchmark out... game benchmarks in.
If FM wants to stay in business, they should get a crapload of demos of the popular games, and measure performance with those instead of running "games" that nobody will ever play.
 

Naffer

Member
Oct 21, 2003
Yea! Shame on Intel! SSE2 may perform FPU calculations 3x as fast, but that's only because it sacrifices a tiny bit of floating-point precision.

Ugh. Don't you see what Nvidia is doing? Any benchmark where scores change on a daily basis without a change in image quality is not a legitimate benchmark. HardOCP pointed that out a few days ago. Nvidia is doing its best to cast doubt on FM's program. It's worked, hasn't it? The next big cards that come out won't be able to wow anyone with just a big 3DMark score, because Nvidia has proven that FM is irrelevant.

I don't care how my video card does in some silly synthetic benchmark. If Nvidia wants to use a shader recompiler, that's up to them. The FX series is hampered by a lack of physical registers, so code that uses an unnecessary number of registers is inefficient on the Nvidia cards.
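
To put a rough number on the register point, here's a little Python toy (nothing to do with a real shader ISA; the "instructions" and the associativity assumption are made up for illustration) that counts how many temporaries are live at once under two schedules of the same arithmetic:

```python
# Count peak register pressure for a straight-line instruction list.
def peak_live(instrs):
    """instrs: list of (dest, [sources]).  Each dest is live from the
    instruction that defines it until its last use.  Returns the peak
    number of temporaries live at any one point."""
    last_use = {}
    for i, (_, srcs) in enumerate(instrs):
        for s in srcs:
            last_use[s] = i
    peak = 0
    for i in range(len(instrs)):
        live = sum(1 for j, (dest, _) in enumerate(instrs)
                   if j <= i <= last_use.get(dest, j))
        peak = max(peak, live)
    return peak

# "Wide" schedule: compute every intermediate first, combine at the end.
wide = [
    ("r0", ["v0"]), ("r1", ["v1"]), ("r2", ["v2"]), ("r3", ["v3"]),
    ("r4", ["r0", "r1"]), ("r5", ["r2", "r3"]), ("r6", ["r4", "r5"]),
]

# "Narrow" schedule: each intermediate is consumed as soon as possible
# (same result if the operation is associative -- exactly the kind of
# freedom a recompiler can exploit).
narrow = [
    ("t0", ["v0"]), ("t1", ["v1"]), ("t2", ["t0", "t1"]),
    ("t3", ["v2"]), ("t4", ["t2", "t3"]),
    ("t5", ["v3"]), ("t6", ["t4", "t5"]),
]

print(peak_live(wide), peak_live(narrow))  # 5 vs. 3 temporaries at the peak
```

Same math either way, but the wide schedule needs five temporaries alive at its worst point and the narrow one only three; on a part with few physical registers, that difference is the whole ballgame.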
 

Blastman

Golden Member
Oct 21, 1999
In the same respect, Homeworld 2 players will be disappointed when their 9800 Pro doesn't perform the way it does in 3DMark2003.
Well, maybe NV is using shader replacement (in Homeworld) and their performance there is too high. :)

3DMark03 is only a general DX9 guideline. But Futuremark didn't just pile together a bunch of nonsensical code with no direction or goal in mind and call it a benchmark. The engineers at Futuremark, I gather, asked the question: what type of code will DX9 software be required to run? They also sat down and consulted with many groups in the software/gaming industry about this exact question. Based on this, they tried (however successfully) to set up a general performance guideline/benchmark.
 

Insomniak

Banned
Sep 11, 2003
Originally posted by: ZimZum
The difference is that, unlike the aforementioned optimizations, nVidia's only offer performance increases in benchmarks, not in actual games. That's one of FM's problems with what nVidia is doing. According to them, if it only increases performance in the benchmark and not in real-world situations, it's not valid.

No sir... DirectX 9 games took a HUGE boost when Nvidia added their compiler.
 

Insomniak

Banned
Sep 11, 2003
Originally posted by: Naffer

I don't care how my video card does in some silly synthetic benchmark. If Nvidia wants to use a shader recompiler, that's up to them. The FX series is hampered by a lack of physical registers, so code that uses an unnecessary number of registers is inefficient on the Nvidia cards.

Agreed. That's pretty much the thing holding the architecture back....

There were two huge mistakes Nvidia initially made with the GeForce FX line:

1) A 128-bit memory interface, corrected as of NV35
2) About half as many PS2.0 registers available as the R3xx architecture

I'd bet dollars to pesos that NV40 is chock full of PS2.0 registers and pixel pipes. I'd be heartily surprised if Nvidia hasn't learned a few things here.
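
For what it's worth, the back-of-the-envelope math on point (1), using the commonly quoted launch clocks (treat the exact figures as assumptions; check period reviews for specific boards):

```python
# Peak memory bandwidth = bus width in bytes * effective transfer rate.
def peak_bandwidth_gb_s(bus_bits: int, effective_mhz: float) -> float:
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

print(peak_bandwidth_gb_s(128, 1000))  # NV30-style: 128-bit @ 500 MHz DDR -> 16.0 GB/s
print(peak_bandwidth_gb_s(256, 620))   # R300-style: 256-bit @ 310 MHz DDR -> ~19.8 GB/s
print(peak_bandwidth_gb_s(256, 850))   # NV35-style: 256-bit @ 425 MHz DDR -> ~27.2 GB/s
```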

 

McArra

Diamond Member
May 21, 2003
Originally posted by: Matthias99
The point is that Futuremark has asked the GPU manufacturers not to use application detection in 3DMark03. This is their call, and NVIDIA should not be trying to sidestep their restrictions. NVIDIA is refusing to go along because they believe this is unrealistic (although ATI seems to have no problem with this!) Whether it is a good idea or not is an entirely separate issue from whether or not NVIDIA should follow the rules that have been established.

First NVIDIA claimed that 3DMark03 was a bad benchmark because it didn't use realistic enough code (and then got caught "cheating" at it after disparaging it as inaccurate and meaningless). Now their problem is that FM won't let them hand-optimize for it (even in ways that could be used for real programs) -- although since a few real DX9.0 games have come out, I notice they've stopped trashing its numbers as unrealistic.

ATI put out a pretty amusing statement a few weeks back about the percentage of games that actually get hand-optimized shaders and the like written for them by the driver team. Basically, only the most popular games and benchmarks will get worked on by NVIDIA and ATI. FM wants 3DMark03 to be representative of the raw hardware/baseline driver performance of the cards, not of how much time the driver team spent working on custom shaders for it. If you don't think that's a good idea, or that it's representative of "real" performance from the cards, then just disregard all 3DMark03 results.

But it does... ask anybody with a GeForce FX... the 52.16 drivers provide better performance in every game than the 45.xx drivers.

Then shouldn't they provide better results in 3DMark03, even without app-specific optimizations? If not, this means all NVIDIA has been doing for the last 6 months is application-specific enhancements rather than making their drivers faster in general.

Exactly. It's like racing an F1 car with a turbo (for those who don't follow F1: turbos are disallowed). The FIA comes along and says there are rules for racing in F1, and Nvidia says: we have proven our cars are faster than our competitors'. In which case the FIA says: OK, you're disqualified; go put a turbo in a WRC car and make it the fastest if you want (turbos are allowed in WRC).

It's just like that.
 

Ackmed

Diamond Member
Oct 1, 2003
I find it very sad that people actually like nVidia cheating, lying to, and betraying customers. Go ahead and rejoice about nVidia once again cheating to deceive the general public. Too bad they don't put the energy into making good products that they put into acting like babies.
 

Jeff7181

Lifer
Aug 21, 2002
Originally posted by: Ackmed
I find it very sad that people actually like nVidia cheating, lying to, and betraying customers. Go ahead and rejoice about nVidia once again cheating to deceive the general public. Too bad they don't put the energy into making good products that they put into acting like babies.

nVidia acting like babies?

<baby>nVidia cheats... waahhhhh, nVidia lies.... waaahhhhhh, nVidia betrayed its customers... waaaaaahhhhhh</baby>
 

Insomniak

Banned
Sep 11, 2003
Originally posted by: Jeff7181
Originally posted by: Ackmed
I find it very sad that people actually like nVidia cheating, lying to, and betraying customers. Go ahead and rejoice about nVidia once again cheating to deceive the general public. Too bad they don't put the energy into making good products that they put into acting like babies.

nVidia acting like babies?

<baby>nVidia cheats... waahhhhh, nVidia lies.... waaahhhhhh, nVidia betrayed its customers... waaaaaahhhhhh</baby>


No joke. Fanboys love to cry don't they?
 

Rage187

Lifer
Dec 30, 2000
They are only trying to fool the complete dumbasses of the world.


are you a dumbass?


so why are you crying?


/cheer @ Nv


/rude @ FM
 

Matthias99

Diamond Member
Oct 7, 2003
Some stunningly brilliant discourse here...

<baby>nVidia cheats... waahhhhh, nVidia lies.... waaahhhhhh, nVidia betrayed its customers... waaaaaahhhhhh</baby>
No joke. Fanboys love to cry don't they?
They are only trying to fool the complete dumbasses of the world.
are you a dumbass?

*Any* company lying to make their products look better than they really are is a Bad Thing. It's called deceptive advertising, and in most industries this gets you in trouble with the FTC. Since NVIDIA doesn't specifically tout its 3DMark scores as a selling point of its cards (there are plenty of enthusiast sites to do that for them), it can get away with stuff like this.

Futuremark is asking *everyone* (not just NVIDIA) to not use application detection in 3DMark03. NVIDIA is refusing to go along. This is a Bad Thing, as the whole point of a benchmark is that it should be a level playing field -- and in this case, FM has decided they want to measure the baseline driver performance, not how it looks after they've replaced all the shaders with hand-tuned versions. If you don't like what Futuremark is doing, fine! Go talk about them and their unrealistic benchmarks and how they're (supposedly) out to get NVIDIA. Don't hold up NVIDIA as a shining beacon of virtue here, because what they're doing is, at best, verging on deceptive. I would have more respect for them if they stopped trying to do these optimizations -- you can't tell me the time wouldn't be better spent fixing driver bugs and improving performance in real games -- and just went back to their old stance of "3DMark03 isn't representative of real DX9-class games."
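
For anyone unclear on the term, "application detection" boils down to something like this (a hypothetical sketch with made-up names; real drivers can key off more than the executable name, and this is not anyone's actual driver code):

```python
# Toy illustration of per-application special-casing in a driver-like layer.
RECOGNIZED_APPS = {"3dmark03.exe", "3dmark2001se.exe"}  # hypothetical list

def pick_code_path(exe_name: str) -> str:
    """Return which code path gets used for the running application."""
    if exe_name.lower() in RECOGNIZED_APPS:
        return "app-specific path: hand-tuned shaders, special-cased settings"
    return "generic path: whatever the normal compiler and driver produce"

print(pick_code_path("3DMark03.exe"))    # gets the special treatment
print(pick_code_path("homeworld2.exe"))  # gets the generic path
```

That per-application branching is exactly what Futuremark's guidelines ask vendors not to do in 3DMark03.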

 

ronnn

Diamond Member
May 22, 2003
Originally posted by: Insomniak
Originally posted by: Jeff7181
Originally posted by: Ackmed
I find it very sad that people actually like nVidia cheating, lying to, and betraying customers. Go ahead and rejoice about nVidia once again cheating to deceive the general public. Too bad they don't put the energy into making good products that they put into acting like babies.

nVidia acting like babies?

<baby>nVidia cheats... waahhhhh, nVidia lies.... waaahhhhhh, nVidia betrayed its customers... waaaaaahhhhhh</baby>


No joke. Fanboys love to cry don't they?

Another thread dissolves into name calling

:disgust:



 

vshah

Lifer
Sep 20, 2003
Nvidia created Cg because, when they started working on their DX9 parts, there was no set standard for DX9 (Microsoft's HLSL did not exist yet), and they needed a programming interface/language for their GPUs. ATI also had their own compiler/language, etc. When the DX9 spec was finalized by Microsoft, it ended up closer to ATI's implementation.


also, Futuremark is retarded.

long live my Geforce 2 Ultra


-Vivan