Nvidia accused of cheating over graphics tech


ronnn

Diamond Member
May 22, 2003
Originally posted by: Mem
Nvidia released those drivers with the 6800ultra to be reviewed


Regardless, those drivers will only be what Nvidia can release at the time; it does not mean they are 100% perfect. I've yet to see any company release a driver with a new video product that's near perfect. Nvidia will improve the drivers for the 6800, that's a fact.

Drivers are always evolving, especially with new products. Are you going to worry about a so-called "cheating driver" on a product that's not even out yet?

Bottom line, it's too early to judge IMHO, especially for the reasons I've listed above.


Yes, I agree, too early to judge, but at the same time I appreciate the flak DriverHeaven has taken to point out potential problems. I am hoping stuff like this will keep the industry honest. It is not DriverHeaven's fault that Nvidia released a product for review with immature drivers. If people only want to read positive reviews, they can go to Nvidia's homepage.

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
3D Performance with Far Cry - Part 2 NVIDIA


After further analysis with Far Cry, it turns out the game isn't quite the technological tour de force we suggested in part 1 of our 3D Performance with Far Cry article. In that article, we highlighted the game's use of 2.0 shaders, and while these shaders are used in the game, it turns out that they're not used as extensively as we suggested. In fact, from what we can tell, Far Cry mainly uses 1.1 shaders to achieve the jaw-dropping visuals we discussed in the intro.

In layman's terms, this means that NVIDIA's three-year-old GeForce3 GPU is capable of reproducing most of Far Cry's brilliant eye candy, although at a significantly reduced frame rate.
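
If you're wondering how a game can tell these paths apart in the first place, here's a minimal C++/Direct3D 9 sketch of my own (not the article's or Crytek's code) that asks the driver which shader versions the card supports. The printed comparisons in the comments are the usual caps reported by these cards:

#include <cstdio>
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

int main() {
    // Requires the DirectX 9 runtime.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        // Versions are packed as (major << 8) | minor in the low word:
        // a GeForce3 reports pixel shader 1.1, a Radeon 9700 reports 2.0,
        // and a GeForce 6800 reports 3.0.
        printf("Pixel shader %u.%u, vertex shader %u.%u\n",
               (unsigned)((caps.PixelShaderVersion >> 8) & 0xFF),
               (unsigned)(caps.PixelShaderVersion & 0xFF),
               (unsigned)((caps.VertexShaderVersion >> 8) & 0xFF),
               (unsigned)(caps.VertexShaderVersion & 0xFF));

        if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
            printf("ps_2_0 path available\n");
        else
            printf("falling back to ps_1_1-class effects\n");
    }
    d3d->Release();
    return 0;
}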

Don't believe us? Let's take a look at one of Far Cry's best features, its water:

. . .

Recently there's been some controversy surrounding a set of screenshots NVIDIA sent to the press involving shader model 3.0 and Far Cry. In the screenshots, NVIDIA provides a comparison of 1.1 shaders to 3.0 shaders. According to NVIDIA: "The following before-and-after images are from the CryEngine. Developer Crytek uses Shader Model 3.0 techniques (vs. 1.x shaders) to add more depth and realism to the scenes. Notice the more realistic look and feel when SM 3.0 is applied to the bricks and stones that make up the staircase. In the scene featuring the Buddha, the full image comes to life with the use of SM 3.0, with the technique applied to multiple objects." (Editor's Note: The preceding quote came directly from NVIDIA's email to members of the press which accompanied the screenshots):

. . .


Clearly the water in NVIDIA's screenshots doesn't match the output we've just shown you using 1.1 shaders and NVIDIA's GeForce4 GPU. In fact, it resembles Far Cry's low-quality water mode more than anything else (which also happens to use 1.1 shaders). As we discussed at length in our GeForce 6800 Ultra article, the main additions of shader model 3.0 are more instructions and dynamic looping/branching, which are designed to make life easier for content developers and can handle certain operations more efficiently than shader model 2.0 (we mentioned some examples in the preview article). In some cases this looping/branching can bring improved performance, but it can also hinder performance if coded improperly.
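
To make the looping/branching point concrete, here's a rough CPU-side C++ analogy I put together (illustration only, not real shader code). Under SM 2.0 the compiler effectively unrolls a fixed-count loop, so every light is evaluated for every pixel; SM 3.0 allows data-dependent loop counts and real branches:

#include <algorithm>
#include <cmath>
#include <vector>

struct Light { float x, y, z, radius; };

// ps_2_0-style: the loop bound is fixed at compile time, so it is
// effectively unrolled and every light costs full price per pixel,
// whether it contributes or not.
float ShadePixelSM20(const Light lights[4], float px, float py, float pz) {
    float result = 0.0f;
    for (int i = 0; i < 4; ++i) {
        float dx = lights[i].x - px, dy = lights[i].y - py, dz = lights[i].z - pz;
        float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
        result += std::max(0.0f, 1.0f - dist / lights[i].radius);
    }
    return result;
}

// ps_3_0-style: a run-time loop count and a dynamic branch let the
// shader skip lights that can't reach the pixel -- a win when nearby
// pixels branch the same way, a potential loss when they diverge.
float ShadePixelSM30(const std::vector<Light>& lights, float px, float py, float pz) {
    float result = 0.0f;
    for (const Light& l : lights) {
        float dx = l.x - px, dy = l.y - py, dz = l.z - pz;
        float dist2 = dx * dx + dy * dy + dz * dz;
        if (dist2 > l.radius * l.radius)
            continue; // dynamic branch: this light can't affect the pixel
        result += 1.0f - std::sqrt(dist2) / l.radius;
    }
    return result;
}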

The other main highlights of shader model 3.0 have been present in ATI's DX9 hardware for some time now, with the exception of FP32 support.

So what does Far Cry use 2.0 shaders for? Apparently just lighting; everything else is handled by 1.1 shaders.
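
That split makes sense if you count the math. Per-pixel lighting is exactly where ps_1_1's limits (roughly 8 arithmetic instructions, low-precision math) run out, while ps_2_0 allows dozens of instructions at floating-point precision. Here's a quick C++ sketch of my own (again, not the article's or Crytek's code) of one Blinn-Phong light, spelled out on the CPU just to show how fast the instruction count adds up:

#include <cmath>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return { v.x / len, v.y / len, v.z / len };
}

// One light: normalizes, two dots, an add, a pow -- already more
// arithmetic than a ps_1_1-class budget comfortably holds, but an
// easy fit for a ps_2_0 shader with floating-point precision.
float LightPixel(Vec3 n, Vec3 toLight, Vec3 toEye, float shininess) {
    n = normalize(n);
    float diffuse = std::fmax(0.0f, dot(n, normalize(toLight)));
    Vec3 h = normalize({ toLight.x + toEye.x, toLight.y + toEye.y, toLight.z + toEye.z });
    float specular = std::pow(std::fmax(0.0f, dot(n, h)), shininess);
    return diffuse + specular;
}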

With this in mind, we've rounded up one dozen GeForce cards, ranging from the GeForce4 Ti 4200 to the GeForce 6800 Ultra. We've also run numbers comparing the performance of Far Cry 1.0 with Far Cry 1.1, which contains performance optimizations for GeForce FX cards.
Hope this clears things up (worth looking at the article). :)