Interesting read about nVidia's hardware bug

Skinner2

Member
Dec 10, 2000
99
0
0
Check out the article
HERE


I'm curious to see how much the card slows down once the fix is implemented into the drivers... Anybody else read the article?
 

Dark4ng3l

Diamond Member
Sep 17, 2000
5,061
1
0
I was not aware of any hardware bug. The problem is with compressing small textures with S3TC (which was made by S3); it's not nVidia's fault and it's not a bug.
 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
Did you read the article? The bug is that the textures are being converted to 16-bit during the compression.
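For anyone who hasn't read it, here's a rough sketch of the mechanism as I understand it from the article (this is just my own illustration, not nVidia's actual driver or silicon): a DXT1 block stores two 16-bit (5:6:5) endpoint colours and the decoder builds two more colours in between them. Do that interpolation at 16-bit precision instead of expanding to 8 bits per channel first and the in-between colours get quantised, which is exactly the banding you see in smooth gradients like the Q3 sky.

// Rough illustration only -- my own C++ sketch of the idea, not real driver code.
#include <cstdint>
#include <cstdio>

struct RGB { int r, g, b; };                     // 8 bits per channel

// Expand a 16-bit 5:6:5 colour to 8:8:8
RGB expand565(uint16_t c) {
    int r = (c >> 11) & 0x1F, g = (c >> 5) & 0x3F, b = c & 0x1F;
    return { (r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2) };
}

// "Proper" decode: expand both endpoints to 8 bits per channel, then blend 2:1.
RGB interpAt32bit(uint16_t c0, uint16_t c1) {
    RGB a = expand565(c0), b = expand565(c1);
    return { (2*a.r + b.r) / 3, (2*a.g + b.g) / 3, (2*a.b + b.b) / 3 };
}

// What the article describes (as I read it): blend the raw 5:6:5 values, so the
// in-between colours never have more than 16 bits of precision.
RGB interpAt16bit(uint16_t c0, uint16_t c1) {
    int r = (2*((c0 >> 11) & 0x1F) + ((c1 >> 11) & 0x1F)) / 3;
    int g = (2*((c0 >>  5) & 0x3F) + ((c1 >>  5) & 0x3F)) / 3;
    int b = (2*(c0 & 0x1F)         + (c1 & 0x1F))         / 3;
    return { (r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2) };
}

int main() {
    uint16_t blue = 0x001F, black = 0x0000;      // a simple blue-to-black gradient
    RGB hi = interpAt32bit(blue, black), lo = interpAt16bit(blue, black);
    printf("32-bit interpolation: %3d %3d %3d\n", hi.r, hi.g, hi.b);
    printf("16-bit interpolation: %3d %3d %3d\n", lo.r, lo.g, lo.b);
}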
 

miniMUNCH

Diamond Member
Nov 16, 2000
4,159
0
0
Dark4ng3l:

The article is about DXT1 in NV10 and NV15, and it is nVIDIA's fault. Radeons and V4/5's handle DXT1 compression just fine.

This is why I dislike nVIDIA: did they ever mention this problem? Hell no! Would they? Hell no! nVIDIA runs a propaganda campaign but their cards blow. Well... this is at least part of the reason why nVIDIA's image quality blows (IMO) in a lot of games.
 

themadmonk

Senior member
Sep 30, 2000
397
0
0
The article clearly states that it is an nVidia problem. How could it not be an nVidia problem? If they supposedly don't use DXT1 compression, why does the article show pictures of the problem on a GeForce? Be real, it is a problem that nVidia has.
 

2dfx

Member
Sep 3, 2000
36
0
0
Hmm, GF cards convert textures to 16-bit with DXT1 on, and hardware reviewers leave DXT1 on while reviewing cards with Q3, even though it looks like crap.

Could it be that nVidia was cheating with Q3 benchmarks and possibly others?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Could it be that nVidia was cheating with Q3 benchmarks and possibly others?

That's a very good point. It's already known that nVidia are using various other cheats in the Detonator 3 drivers (e.g. a forced 16/20-bit Z-buffer) to improve performance in high-res/32-bit colour situations. This could be yet another example of it.
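For anyone wondering why a forced 16-bit Z-buffer is worth cheating for: fewer depth bits means fewer distinct depth values, so you get Z-fighting at distance but the card pushes less data around. A back-of-the-envelope sketch (my own numbers and near/far planes, nothing from the article):

#include <cstdio>
#include <cmath>

// Post-projection depth for a point at eye-space distance z, with near/far
// planes n and f (standard perspective projection, depth in [0,1]).
double depthValue(double z, double n, double f) {
    return (f / (f - n)) * (1.0 - n / z);
}

int main() {
    double n = 4.0, f = 4096.0;                  // rough Quake-ish near/far planes
    int depths[] = {16, 24};
    for (int bits : depths) {
        double buckets = std::pow(2.0, bits);    // distinct depth values available
        // How far apart (in world units) can two surfaces at z = 1000 be before
        // the Z-buffer can even tell them apart? Crude doubling search.
        double z = 1000.0, dz = 0.01;
        while (std::floor(depthValue(z, n, f) * buckets) ==
               std::floor(depthValue(z + dz, n, f) * buckets))
            dz *= 2.0;
        printf("%2d-bit Z-buffer: surfaces ~%.2f units apart at z=1000 can alias\n",
               bits, dz);
    }
}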
 

MGallik

Golden Member
Oct 9, 1999
1,787
4
81
Good mornin' folks. :)

Next thing you know, people will start calling all those nVidia driver releases what they truly are: patches. ;)
 

pidge

Banned
Oct 10, 1999
1,519
0
0
OMG BFG10k

Have you taken a look at ATI's drivers? Or what about 3dfx? ATI is well known to include special checks in their drivers which search for benchmarks and set up special tweaks for that particular benchmark. Anyone remember ATI's Turbo drivers? You all remember what they were, don't you? ATI's drivers also force a 16-bit Z buffer which greatly helps with benchmarks. Don't give me this surprised look as if NVIDIA is the only one who tweaks their drivers.
 

Hawk

Platinum Member
Feb 3, 2000
2,904
0
0
Yeah, but this thread is about Nvidia's method of displaying DXTC, not how Nvidia and ATI and 3dfx cheat in other ways. Oh, and please tell me more about these Turbo drivers; I don't have any knowledge/experience with them.
 

pidge

Banned
Oct 10, 1999
1,519
0
0
A very long time ago, ATI released "Turbo Drivers" which offered a 40% increase in speed for their Rage Pro line of video cards. Later, when they were dissected, it was found that the drivers actually contained special code which changed your driver settings without your knowledge in order to score highly on a benchmark. After the benchmark was over, it reverted back to its previous settings. In real-world games and applications, those drivers were actually slower than their previous drivers.
 

WyteRyce

Member
Apr 16, 2000
80
0
0
Why is it that my GTS can run UT and Star Trek: Elite Force just fine with texture compression on?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Pidge:

ATI is well known to include special checks in their drivers which search for benchmarks and set up special tweaks for that particular benchmark.

That's hardly the same thing as having defective DXT1 hardware, forcing 32 bit textures into 16 bits when compressing them and using a forced 16 bit Z-buffer.

Also does ATi's/3dfx's image quality blow like nVidia's when those tweaks are applied? I think not. And if you don't like the tweaks with ATi/3dfx you can turn them off.

ATI's drivers also force a 16-bit Z buffer which greatly helps with benchmarks.

I doubt that *very* much, especially since there are settings in their drivers, in both OpenGL and Direct3D, that change the Z-buffer settings. And this link clearly shows the default settings are not forcing a 16 bit Z-buffer.

You can see that ATi even have a setting to force 32 bit textures into 16 bit textures, which just happens to be what nVidia are already doing with their DXT1 compression scheme.

How do you disable nVidia's forced 16 bit Z-buffer? How do you disable nVidia's forced 16 bit textures in DXT1? Registry hacks, probably, because in these Direct3D and OpenGL screenshots I don't see anything to change any of those tweaks. In fact nVidia don't even tell you that they're tweaking something.

DaveB3D was right to tweak the V5 in his reviews. nVidia are quite clearly doing so already and not even telling anyone, and you have no way of turning their tweaks off if you wish (unless you know how to hack the registry). Plus Dave's tweaks have no effect on the image quality, unlike nVidia's.

Don't give me this surprised look as if NVIDIA is the only one who tweaks their drivers.

This is "tweaking" to the point where you have unacceptable image quality, and you don't even have a choice not to use the tweaks. All nVidia care about is getting the highest towers in the framerate bar graphs. And bizarrely not one of the reviewers seem to care about the sub-par image quality. IMO nVidia's boards should all be benchmarked with S3TC turned off and compared to everyone else with S3TC turned on.

A good quote from the article:

Thanks to his pioneering investigative work, we finally know the real reason behind this problem that nVidia and id were silent about.

That means no more pointing the finger at Carmack. It's a hardware fault with nVidia.
 

OneOfTheseDays

Diamond Member
Jan 15, 2000
7,052
0
0
Who gives a crap about DXT1? If you don't like it, turn it off. You could also use the patches that fix the Q3A sky problem if you want texture compression. Or you could simply TURN OFF ALL FORMS OF TEXTURE COMPRESSION. The GeForce will still beat the Radeon with texture compression off.
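For anyone who wants to try exactly that: if I remember the cvar name right, you can toggle it from the Quake 3 console and restart the renderer for it to take effect:

r_ext_compressed_textures 0
vid_restart

Set it back to 1 and do another vid_restart to turn compression back on.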
 

themadmonk

Senior member
Sep 30, 2000
397
0
0
The article is about the hardware part of the problem, not whether the drivers force one thing or another. It is a hardware problem, and talking about ATi and 3dfx forcing anything says nothing about whether their hardware has a defect like the one nVidia's shows. Even an nVidia representative admitted that there is a hardware problem; he never claimed it was a driver problem, and the admission only came after the fact.
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
<Robo stands to the side, with a wry smirk, knowing that he's been bitching about this NVIDIA HARDWARE PROBLEM for several months now>

some people will say "I hate to say 'I told you so'"

I make no bones about it. I LIKE to say "I told you so"

<more arrogant smirking from Robo>

what's funny is that Ben and I have had a few wars about this. Ben writes for gamebasement.

heh.....that's who posted the article about S3TC being jacked up on nvidia cards. gamebasement.

WH000000T!

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Dark4ng3l:

Hmm nVidia does not us dxt1 compression. Only the v5 and radeon(I think) use it.

There are 5 DirectX compression techniques as defined by the Microsoft Direct3D API (DXT1 - DXT5). nVidia support all five of them. S3TC supports a subset of the 5 methods (only the first three I think).
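For reference, this is roughly how the block layouts compare, going from memory of the DirectX docs (the sizes are the point, don't quote me on the field names):

#include <cstdint>

struct DXT1Block {            // 64 bits per 4x4 block of texels (0.5 bytes/texel)
    uint16_t color0;          // endpoint colour 0, RGB 5:6:5
    uint16_t color1;          // endpoint colour 1, RGB 5:6:5
    uint32_t indices;         // 16 texels x 2-bit index into the 4-colour table
};

struct DXT3Block {            // 128 bits per 4x4 block (1 byte/texel)
    uint64_t alpha;           // 16 texels x 4-bit explicit alpha
    DXT1Block color;          // same colour block layout as DXT1
};

static_assert(sizeof(DXT1Block) == 8,  "DXT1: 8 bytes per block");
static_assert(sizeof(DXT3Block) == 16, "DXT3: 16 bytes per block");

So DXT3/DXT5 blocks are twice the size of DXT1 blocks, but all of them are still a fraction of an uncompressed 32-bit texture.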

WyteRyce:

Why is it that my GTS can run UT and Star Trek: Elite Force just fine with texture compression on?

Because the game is likely not using the DXT1 texture compression technique. It's probably using one of the other four available algorithms.

Sudheer Anne:

Who gives a crap about DXT1? If you don't like it, turn it off. You could also use the patches that fix the Q3A sky problem if you want texture compression. Or you could simply TURN OFF ALL FORMS OF TEXTURE COMPRESSION.

You're missing the point by a mile.

The GeForce will still beat the Radeon with texture compression off.

I doubt that *very* much.

EDIT: There are 5 DirectX compression methods, not 6. My apologies.
 

Compellor

Senior member
Oct 1, 2000
889
0
0
Actually, Quake 3 runs fast on my system WITHOUT texture compression turned on -- with the exception of a few maps, like Q3DM6, Q3DM9, Q3DM11 and Q3DM15. But, even then, it's playable. I run it with everything set to the max, too. Who the hell looks at the sky when they're busy fraggin' everything that moves? A perfectionist?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Compellor:

Actually, Quake 3 runs fast on my system WITHOUT texture compression turned on -- with the exception of a few maps, like Q3DM6, Q3DM9, Q3DM11 and Q3DM15.

Compellor, you're missing the point. All that most of the reviewers and readers care about is the framerate graphs and the fact that nVidia is at the top of them. Remove S3TC from nVidia's equation and nVidia's boards will no longer be at the top.

Forcing 32 bit textures into 16 bits (for obvious performance gains) when all of the review sites benchmark Quake 3 with S3TC turned on? That's just as sneaky and deceptive as introducing a forced 16 bit Z-buffer into the Detonator 3 drivers to beat the Radeon.
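Just to put numbers on why that kind of forcing is an "obvious performance gain" (my own back-of-the-envelope figures, not from the article): halve the bits per texel and you halve the texture memory and bandwidth.

#include <cstdio>

int main() {
    const int w = 256, h = 256;                            // one typical 256x256 texture
    printf("32-bit RGBA : %4d KB\n", w * h * 4 / 1024);    // 256 KB
    printf("16-bit      : %4d KB\n", w * h * 2 / 1024);    // 128 KB
    printf("DXT1 (4bpp) : %4d KB\n", w * h / 2 / 1024);    //  32 KB
    // Less data per texel means less memory traffic, which is exactly what
    // shows up as taller bars in the framerate graphs.
    return 0;
}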

The point is that nVidia's framerates with S3TC turned on can't really be compared to the competition. They are completely invalid.
 

Taz4158

Banned
Oct 16, 2000
4,501
0
0


<< ATI's drivers also force a 16-bit Z buffer which greatly helps with benchmarks. >>


Another blatant untruth from the desk of Pidge. You're on a roll buddy!
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
BFG-

There are so many errors in your posts in this thread that I don't want to go point by point, but one is so far out there it is Hardware-like. nVidia does not use a 16-bit Z-buffer; they are by far the best in this aspect and do not have the rather serious flaws that ATi displays in this particular area with the Radeon. Another quick point: you take on average a whopping 2 FPS hit using DXT3 instead of DXT1 (running Quaver, not Timedemo1); the problem is that boards that don't support all the methods, such as the Radeon, show extreme corruption when you enable it.

Robo-

Don't think this is the final word:)
 

pen^2

Banned
Apr 1, 2000
2,845
0
0
for all i know pidge is telling the truth... those 'special' Rage drivers improved that cheesy ZDNet 3D benchmark thingy (forgot the name of it, not that i care) by 40% as pidge has mentioned, but brought degraded performance in all real-world 3D games/apps... go figure...
 

Doomguy

Platinum Member
May 28, 2000
2,389
1
81
Uh, didn't we all come to the conclusion before that the Radeon used a hack by not compressing 128x128 textures to avoid the S3TC compression problem? That article doesn't seem to be correct. S3's own boards had the same problem. id is compressing lightmaps in Q3 when they shouldn't be.

BFG: You believe MrNSX at arstechnica about the nVidia 20-bit Z-buffer thing? HAHA. Why is nVidia's Z-buffer so much more accurate than ATI's if it's a 20-bit Z-buffer?