Maybe a little cheating going on with nVidia's 40.41's

SilverBack

Golden Member
Oct 10, 1999
1,622
0
0
I observed something today that I thought was interesting.
I tried to get Kyle Bennett of HardOCP to look at what I had found, but he wasn't interested. Maybe the readers here can help.


If you have an XP setup and an nVidia card, see if you can validate my findings...

What I wrote to Kyle.

Hi,
I've been reading HardOCP for quite a while now, and when I discovered what I want to show you, your site came to mind immediately.
I'd like you to verify this, and I'm willing to bet that you can.

I'm running WinXP.
I have Multires from Entech installed to take care of issues with refresh rates in games. I hate the 60Hz issue as it's very hard on the eyes.
It's also a fresh install of WinXP on my box.

I ran 3D Mark 2001 SE as a benchmark. I had already installed Multires and locked my rates. I noticed that the screen was flickering, so I hit the Viewmeter button on my ViewSonic G810 monitor, and of course it said 60Hz.
I went to look at the settings in Multires, and the locked rate for 1024x768 32-bit was 100Hz.
I ran the bench again and the same 60Hz appeared.
I rechecked Multires and it was still at 100Hz.
Now I was curious, so I checked 1024x768 16-bit. It was at the default 60Hz.
I locked it at 100Hz and fired up the benchmark.
Now it's at 100Hz!!!!

I'm using the new drivers from nVidia, the 40.41's.
It would appear that the drivers are overriding 3D Mark's 32-bit palette and using 16-bit instead!
Little wonder that 3D Mark scores went up a lot on the older GF line of cards.
My 4600 saw a small jump, but not nearly as much as the others, because the card is CPU limited.

My system:
P4 2.4B GHz CPU
Epox 4G4A+ motherboard
512 MB Kingmax 400 MHz DDR memory
nVidia GF4 Ti 4600
WD 80 GB ATA 100 hard drive w/8 MB of cache
DVD drive, burner
Audigy sound

Kyle's reply:

Kyle:
Well honestly, we don't use all that third-party crap to interfere in the benchmarks when running them and will not be looking into it. And I still really don't understand your explanation. If you have it set for 60/16, it seems as though that is what it is going to run in.

My last try:

Me:
Sorry to hear that.

1024x768 16-bit set to 60Hz in XP
1024x768 32-bit set to 100Hz in XP
Refresh rates are locked.

The benchmark ran at 60Hz with 3D Mark 2001 SE at its default settings, which should of course be 32-bit (and therefore 100Hz).

If the refresh rate for 16-bit mode is set to 75Hz instead, the resulting benchmark runs at 75Hz.
Therefore the bench is giving bad results: the default settings in 3D Mark 2001 are being ignored and it is running in 16-bit mode.

If someone can verify this, I would appreciate it.
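
If anyone wants to check this without trusting the monitor's OSD, here's a rough sketch of a logger that records what mode the card is actually in while the bench runs. It's Python calling the plain Win32 GetDeviceCaps function through ctypes; the log filename and the one-second interval are just placeholders, and the refresh column can read 0 or 1 on some drivers (which just means "hardware default").

# Sketch: poll the current display mode while the benchmark runs.
# Uses the Win32 GetDeviceCaps call via ctypes, so no third-party tools are involved.
import ctypes
import time

BITSPIXEL = 12   # GetDeviceCaps index: color depth of the current mode
VREFRESH = 116   # GetDeviceCaps index: current vertical refresh rate in Hz

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32

def current_mode():
    hdc = user32.GetDC(0)                      # device context for the whole screen
    bpp = gdi32.GetDeviceCaps(hdc, BITSPIXEL)
    hz = gdi32.GetDeviceCaps(hdc, VREFRESH)
    user32.ReleaseDC(0, hdc)
    return bpp, hz

# Log once a second to a file so the readings survive the full-screen benchmark.
with open("modelog.txt", "w") as log:
    while True:
        bpp, hz = current_mode()
        log.write("%s  %d-bit @ %d Hz\n" % (time.strftime("%H:%M:%S"), bpp, hz))
        log.flush()
        time.sleep(1)

Start it before launching 3D Mark; if the 32-bit run is really being dropped to 16-bit, it should show up in the log the moment the test kicks in.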
 

JAV1

Junior Member
Aug 16, 2002
15
0
0
SilverBack,

FANTASTIC! I've come to the exact same conclusion & can't seem to get anyone to verify it either. The simplest way for the fill rates, high poly counts & vertex shaders to improve in 3D Mark is to go to 16-bit color. That in turn affects the Nature & Drag scores but has little effect elsewhere.

I'm trying to get ppl @ 3D Mark to try it & ...

My problem: I have Win 98 & can't presently try the 40.41. I am downloading the file & I'm looking for the nvdd.32, nvarch32 & nvinst.32 DLLs.

I think you are right about what has been done! Can you come over to the 3D Mark forums & spell it out for others, & maybe we can confirm it ourselves?



JAV
GBA!
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
JAV1, if you haven't tried the 40.41's, how can you confirm SilverBack's findings?
SilverBack, your results seem a little too 'good' to be accurate. If it's true, it would be an interesting situation. I just can't see nVidia doing such a low-down dirty thing, especially now that people are aware of software tweaking.

Mr. Bennett probably declined to investigate because of the state of events following the Van Smith thing.
Entech's Multires ("third party crap"?) is an unnecessary factor in your analysis. You've got to disable it to make sure it isn't conflicting with the nVidia driver or 3DMark settings. What you can do to see if this is true is run 3DMark at both 16-bit and 32-bit. Do you get the same results? If so, your finding will be supported; if not...

Very perceptive, though! ;)
 

JAV1

Junior Member
Aug 16, 2002
15
0
0
As I said: I can't confirm it directly. Here's what I have done, tho': run the 30.82 in 3D Mark @ 16 & 32 & see if it has the same effect. It does.

I've also been comparing others' 16-bit color non-40.41 scores vs. 32-bit color 40.41 scores. Same pattern.

I wish I could confirm it & didn't have to ask others to test my theories, but that is the boat I'm in.

You, uh, couldn't try to confirm/deny it, gururu?
 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
It would appear that the drivers are overriding 3D Mark's 32-bit palette and using 16-bit instead!

Errr, no offense, but this sounds like bunk for the following reasons:

1) IIRC, nVidia's memory bandwidth compression scheme (Lightspeed II) is only enabled when using 32bpp textures, so the benefits of dropping to 16-bit would be offset by the lack of that additional memory bandwidth.

2) If they were using a 16-bit palette instead of a 32-bit palette, it would be easily detectable via screenshots. Can you tell it's 16-bit during 3DMark? It should be fairly obvious.

3) If they were using a 16-bit palette instead of a 32-bit palette, performance in tests other than the Nature test would increase, but from the data I've seen the only substantial, repeatable improvement is in the Nature test.

Update -

Some Jabroney with 40.41
Same Jabroney with 30.82

Fill Rate (Single-Texturing) 897.0 / 902.8 (v30.82 / v40.41)

Fill Rate (Multi-Texturing) 1930.0 / 1940.1 (v30.82 / v40.41)

Wouldn't the multi-textured fill rate demonstrate this 16bpp trick?


 

Jeff7

Lifer
Jan 4, 2001
41,596
19
81
Originally posted by: merlocka

2) If they were using a 16-bit palette instead of a 32-bit palette, it would be easily detectable via screenshots. Can you tell it's 16-bit during 3DMark? It should be fairly obvious.

Good idea - can anyone get them? I seem to be unable to get screenshots in WinXP - PrintScreen and Alt+Shift+PrintScreen don't seem to do it. Am I doing it wrong?


Edit: I just found a good test for someone to get screenshots of (for GF4 Ti owners): Advanced Pixel Shader. In 16-bit color, there is obvious banding in the water. In 32-bit, the banding is gone.
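
If eyeballing the banding feels too subjective, there's a cheap way to put a number on it: a frame rendered at 16-bit (R5G6B5) can only contain 32 distinct red levels, 64 green and 32 blue, no matter how it gets expanded to 24-bit on save. Counting the distinct values per channel in the F12 capture would give it away. A rough sketch, assuming Python with the PIL imaging library and a made-up filename (run it on the raw BMP, not a JPG conversion, since JPEG compression smears the levels):

# Sketch: count distinct levels per channel in a 3DMark F12 screenshot.
# A frame rendered at 16-bit (R5G6B5) can show at most 32 red, 64 green
# and 32 blue levels, even after being saved as a 24-bit BMP.
from PIL import Image

def channel_levels(path):
    im = Image.open(path).convert("RGB")
    red, green, blue = im.split()
    return tuple(len(set(band.getdata())) for band in (red, green, blue))

r, g, b = channel_levels("3dmark_capture.bmp")   # placeholder filename
print("distinct levels  R=%d  G=%d  B=%d" % (r, g, b))
if r <= 32 and g <= 64 and b <= 32:
    print("consistent with a 16-bit frame")
else:
    print("more levels than 16-bit allows - looks like a genuine 32-bit frame")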
 

SilverBack

Golden Member
Oct 10, 1999
1,622
0
0
OK Jeff7,
I tried to do a comparison shot.
Hitting F12 during the bench saves a screen capture to the 3D Mark 2001 directory.

I ran the test at both color depths.
Here's how I got the single shot:

I brought up each pic in its own Paintbrush window.
I cut the right half of the first pic, moved to the second window and did the same.
I pasted the second cut back in, and saved the pic.

I then converted the file to JPG, as it was a pretty large file.

Comparison shot.
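
For anyone who wants to make the same kind of half-and-half shot without the Paintbrush surgery, something along these lines would do it (again just a sketch in Python with PIL; the filenames are placeholders). Saving the result as PNG instead of JPG keeps it lossless, so the compression can't hide or add banding:

# Sketch: paste the right half of one capture over the other to get a
# side-by-side comparison, the same thing done by hand in Paintbrush.
from PIL import Image

shot_32 = Image.open("capture_32bit.bmp")   # placeholder filenames
shot_16 = Image.open("capture_16bit.bmp")

w, h = shot_32.size
right_half = shot_16.crop((w // 2, 0, w, h))   # right half of the 16-bit shot
combined = shot_32.copy()
combined.paste(right_half, (w // 2, 0))        # drop it onto the 32-bit shot
combined.save("comparison.png")                # PNG keeps it lossless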
 

jbond04

Senior member
Oct 18, 2000
505
0
71
Well, I guess that settles it then. The 16-bit and 32-bit shots look quite different, so the 32-bit run really is rendering in 32-bit. False alarm! Move along here. :p

That was some good detective work, though!
 

Mingon

Diamond Member
Apr 2, 2000
3,012
0
0
Have you made sure DirectX is running at the highest refresh? I am betting not. Type dxdiag into the Run command, go to More Help, click 'Override', and then type in the refresh rate you want to force DirectX to use.
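
As far as I know, that Override setting just writes a refresh-force value into the registry, so you can also check from a script whether an override is already in place. The key and value names below are an assumption on my part (they're the ones the refresh-fix utilities are usually said to use), so treat this as a sketch, not gospel:

# Sketch: check whether a DirectX refresh-rate override appears to be set.
# NOTE: the key/value names are an assumption, not confirmed - verify locally.
import winreg

CANDIDATES = [
    (r"SOFTWARE\Microsoft\DirectDraw", "ForceRefreshRate"),
    (r"SOFTWARE\Microsoft\Direct3D\Drivers", "ForceRefreshRate"),
]

for subkey, value_name in CANDIDATES:
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, subkey) as key:
            value, _ = winreg.QueryValueEx(key, value_name)
            print("%s\\%s = %s Hz" % (subkey, value_name, value))
    except OSError:
        print("%s\\%s is not set" % (subkey, value_name))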
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
If it is Nature and Drag that see the major improvements in score, then why are we looking at a scene that supposedly doesn't see as drastic an improvement?
 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
Originally posted by: bunnyfubbles
If it is Nature and Drag that see the major improvements in score, then why are we looking at a scene that supposedly doesn't see as drastic an improvement?

Assuming that nVidia is not only switching to 16-bit, but somehow only doing so for 2 of the 4 tests in 3DMark?

I've got a crazy theory about this... perhaps the nVidia PR regarding the drivers is *true* and the only benchmarks which will see real improvement are the vertex- and pixel-shader-limited ones...

A lot of the improvements in performance are due to improved efficiency in the driver code that affects vertex and pixel shaders. That is why the jump in Nature. A big jump can be seen in Aquanox, too, another app that uses vertex shaders and pixel shaders. Apps that use pixel and vertex shaders also just happen to be the ones that are popular for benchmarking. Games that do not use vertex and pixel shaders may not see as significant an improvement as games that do.

Brian Burke
NVIDIA Corp.

Also, here is his response regarding the issue with AF after it's been changed from the default setting:


The default setting on the new driver and the old driver are both the same. The problem that is being identified is a bug that occurs when the AF settings are changed. It seems that once the AF slider is moved away from 0 to another setting, then back to 0 again, it triggers a bug that causes the control panel to misread the register setting. That setting is not point sampling, but it is something less than 0. The only way to "reset" back to true 0 is by reinstalling the driver or deleting the registry setting. We are going to fix this bug and others and submit the new BETA driver to WHQL, before the driver moves from BETA to "official".

Brian Burke
NVIDIA Corp.
 

JAV1

Junior Member
Aug 16, 2002
15
0
0
Merlocka,

Thanks for the links & the info. I have to admit that the links you gave do tend to counter the 16bpp 'theory'.

As for 16bpp affecting every test in 3D Mark: it doesn't. Run Car Chase &/or Lobby @ 16 & 32 & the 'improvements' mimic the 40.41 'improvements': none to slight. 16bpp affects the other tests more than the 40.41 does, & it doesn't seem to give the big jump in Nature that is being seen. I am leaning toward the possibility that 16bpp isn't the answer. (24-bit???)
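
(To make that kind of pattern comparison less of an eyeball job, a throwaway sketch like the one below lines up the per-test percentage deltas. The test names are the 3D Mark 2001 game tests; the scores are placeholders to fill in with real runs.)

# Sketch: line up per-test deltas so "16-bit vs 32-bit" and "30.82 vs 40.41"
# can be compared pattern-to-pattern instead of by eye.
def deltas(baseline, candidate):
    """Percent change per test; both arguments map test name -> score."""
    return {
        test: 100.0 * (candidate[test] - baseline[test]) / baseline[test]
        for test in baseline
        if test in candidate
    }

# Placeholder scores - replace with your own runs before drawing conclusions.
run_3082_32bit = {"Car Chase": 1.0, "Dragothic": 1.0, "Lobby": 1.0, "Nature": 1.0}
run_4041_32bit = {"Car Chase": 1.0, "Dragothic": 1.0, "Lobby": 1.0, "Nature": 1.0}
run_3082_16bit = {"Car Chase": 1.0, "Dragothic": 1.0, "Lobby": 1.0, "Nature": 1.0}

print("driver change (30.82 -> 40.41 @ 32-bit):", deltas(run_3082_32bit, run_4041_32bit))
print("color depth change (32-bit -> 16-bit @ 30.82):", deltas(run_3082_32bit, run_3082_16bit))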

Here is what is screwy to me: the vertex score drops w/the 40.41's, yet that is supposed to be optimized. Also, nVidia has this on their website (concerning the GF4 MX's): "Complete DirectX support, including DX8.1." Then why aren't those cards showing any gains? They have the NSR engine & LMA II, so shouldn't there be comparable gains (vs. GF3/GF4 Ti) given their clock speeds & memory bandwidth?

Appreciate the input,
 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
Originally posted by: JAV1
Merlocka,

Thanks for the links & the info. I have to admit that the links you gave do tend to counter the 16bpp 'theory'.

As for 16bpp affecting every test in 3D Mark: it doesn't. Run Car Chase &/or Lobby @ 16 & 32 & the 'improvements' mimic the 40.41 'improvements': none to slight. 16bpp affects the other tests more than the 40.41 does, & it doesn't seem to give the big jump in Nature that is being seen. I am leaning toward the possibility that 16bpp isn't the answer. (24-bit???)

Here is what is screwy to me: the vertex score drops w/the 40.41's, yet that is supposed to be optimized. Also, nVidia has this on their website (concerning the GF4 MX's): "Complete DirectX support, including DX8.1." Then why aren't those cards showing any gains? They have the NSR engine & LMA II, so shouldn't there be comparable gains (vs. GF3/GF4 Ti) given their clock speeds & memory bandwidth?

Appreciate the input,

Regarding "Complete DirectX support, including DX8.1" , I think that's just nVidia's marketing department playing games. They (everyone, not just nVidia) tend to be fuzzy about "supporting" a DX version, and actually being compliant. nVidia sayinig the Geforce4 MX supports DX8.1 means (to me) that if you have DirectX8.1 installed, the card will still work :) I don't think they are implying that the Geforce4 MX is DX8.1 compliant because it's pretty obviously a DX7 card.