What in the world is up with Nvidia?!?!

Page 8

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I had forgotten about this thread-

LT-

Old reviews with old drivers, Ben?

As were the numbers you linked to. Try and find some recent numbers on other sites using the boards we are discussing.

BFG-

The 5950 has 50% more memory bandwidth than the 9800 Pro. Taken in that context, it's astonishing that it's not coming out further ahead.

Does the R3x0 kill the NV3x at shaders or not? If it does, then why would the advantage in mem bandwidth be enough to overcome the enormous rift in shaders if they were important? Either 1)- Shaders aren't that big of an impact or 2)- The R3x0 isn't that much faster than the NV3x running shaders. I'm saying it's number one.

Again, you're the only one complaining about the definition of shader loads, and attempting to drop that onto me is simply a strawman. All I'm saying is that those games have some shaders in there and the R3xx architecture runs them better than the NV3x architecture.

You claim the R3x0 is significantly faster than the NV3x at shader performance. I agree with that.

You claim that shaders have been very important in gaming to date. I don't agree with that.

Since we have thousands of benches to draw on, you show me where this great importance is. As I have stated, and I have backed up my assertion, about half of the "shader" games are roughly as fast or faster on NV3x hardware as they are on R3x0. Either you and I are both very wrong about the shader performance of these parts, or you are wrong about the importance of shaders. If you are saying you are correct on both points then just show me. I have already been asked to show numbers and I have; the onus is now on you.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Lol... so if none of us can tell the difference and no one uses 32bit precision and whatnot, WHY INCLUDE it?? All it is doing is slowing the HW down.

Also, is there any other way for Nvidia to make AA and AF more efficient? Not sure how ATI does it, but I know Nvidia's takes up a shader unit or an ALU.

-Kevin
 

lordtyranus

Banned
Aug 23, 2004
1,324
0
0
As were the numbers you linked to. Try and find some recent numbers on other sites using the boards we are discussing.
:roll:
My review was at least done in July 04. Bit more recent than October '03 I'd say. Like I posted before, your own link to the xbit 6600 GT review counters your own points.

Not to mention all the games where the 59xx runs limited and/or modified shader paths.
 

IceMole

Senior member
Sep 28, 2004
284
0
71
Originally posted by: Gamingphreek
IM WATCHING A SLIDESHOW :-( :-( :-(

-Kevin

Now you know how I felt when I ran 3dmark03 on my Ti4200 :) She was a good card, had to put her down for a 6800 :D
 

parkbench

Senior member
Feb 14, 2002
206
0
0
Just saw this huge post and wondered if there was anything important. Hey, you guys are going to work for ATI or nVidia soon right? Because if not, you're the saddest bunch I've ever seen! Almost 200 borderline flamewar posts on absolutely nothing. Neither of you even know what your points are anymore. These are computer system videocards, lighten up! Even the engineers at those two companies would be laughing at you by now, sheesh!
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Does the R3x0 kill the NV3x at shaders or not?
Yes.

Either 1)- Shaders aren't that big of an impact or
In certain titles most definitely, but I never claimed that every title with shaders also has heavy shader loads.

If you are saying you are correct on both points then just show me.
Tyranus posted a decent link just before. Look through the games - the bulk of the ones that use shaders (even lightweight ones like COD) are faster on R3xx hardware.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Lol... so if none of us can tell the difference and no one uses 32bit precision and what not WHY INCLUDE it??

24bit is a complete dead end, and everyone including ATi knows this (not just saying that, it was always meant as a dead end). Their R5x0 part will use FP32. nVidia went with the existing IEEE standard, which is FP32; ATi and MS decided to go with FP24 for DX9 (DX10 will be FP32).
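For what it's worth, here's a rough back-of-the-envelope sketch of what the precision gap actually means. This is illustrative Python only, not how either GPU rounds internally; quantize_mantissa and the chosen value are my own. IEEE FP32 keeps 23 mantissa bits, while ATi's FP24 format keeps roughly 16, so its worst-case rounding error is on the order of a hundred times larger:

```python
# Illustrative only: simulate keeping fewer mantissa bits to compare
# FP32-style vs FP24-style rounding error on a single value.
import math

def quantize_mantissa(x: float, mantissa_bits: int) -> float:
    """Round x to `mantissa_bits` bits of mantissa (exponent left alone)."""
    if x == 0.0 or math.isinf(x) or math.isnan(x):
        return x
    m, e = math.frexp(x)                 # x = m * 2**e with 0.5 <= |m| < 1
    scale = 2.0 ** mantissa_bits
    return math.ldexp(round(m * scale) / scale, e)

value = 1.0 / 3.0
fp32_like = quantize_mantissa(value, 23)   # IEEE single precision: 23-bit mantissa
fp24_like = quantize_mantissa(value, 16)   # FP24 keeps roughly 16 mantissa bits

print(f"FP32-like rounding error: {abs(value - fp32_like):.1e}")   # on the order of 1e-8
print(f"FP24-like rounding error: {abs(value - fp24_like):.1e}")   # on the order of 1e-6
```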

LT-

My review was at least done in July 04. Bit more recent than October '03 I'd say.

Err, I posted a variety of reviews- that way there isn't a question of one site fudging numbers to support their IHV of choice.

Like I posted before, your own link to the xbit 6600 GT review counters your own points.

And the other links I provided? I quote numerous sites, not just one, for a reason.

Not to mention all the games where the 59xx runs limited and/or modified shader paths.

And all the games where ATi does comparable things (along with using modified AF for the majority of titles with no option to disable it)? That one is old; ATi was caught doing the same things as nV.

BFG-

In certain titles most definitely, but I never claimed that every title with shaders also has heavy shader loads.

Then why is it that shaders have been so important? You tell me.

Tyranus posted a decent link just before. Look through the games - the bulk of the ones that use shaders (even lightweight ones like COD) are faster on R3xx hardware.

He pulled up numbers from one review, I pulled them up from numerous reviews. I used multiple reviews for a reason(even managing to dig some numbers up on Xbit).
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Then why is it that shaders have been so important? You tell me.
Because the bulk of the games that do use shaders run faster on R3xx - quite significantly at times - than they do on nV3x hardware.

He pulled up numbers from one review, I pulled them up from numerous reviews.
And many numbers from your own review back the pro-shader claims. It's easy to find exceptions but overall the superiority of the R3xx architecture to run shaders is quite clear and has been quite clear for some time.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
It's easy to find exceptions

Incredibly easy; so easy that about half of all of the 'shader' games are as fast or faster on nV hardware (actually, using your list tilts the percentage even more in nV's favor). You can count the 'shader' games where the R3x0 has a decisive edge on one hand, with fingers to spare, and even those don't show anything close to the theoretical edge the R3x0 has the overwhelming majority of the time.

but overall the superiority of the R3xx architecture to run shaders is quite clear and has been quite clear for some time.

No sh!t, when have I ever so much as implied anything else? The point I've been making is that that advantage has been very close to a complete non-factor. Minimal in a few titles, with only a couple showing anything remotely appreciable, and this is multiple years after the PR BS about the 'shader revolution' started. It has been a near non-factor and will remain that way until we see some hardware that is honestly good at handling shaders.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Incredibly easy, so easy it's about half of all of the 'shader' games that are as fast or faster on nV hardware(actually, using your list tilts the percentage even more in nV's favor).
Not really. I don't have time to pick apart every benchmark you've supplied but I'd wager the bulk of the results you're supplying are CPU limited and/or not shader bound.

You can count the 'shader' games that have a decisive edge on the R3x0 with one hand, with fingers to spare,
Nonsense; it's quite easy to come up with 1-2 dozen games where the R3xx has the advantage, sometimes 2-3 times the speed.

The point I've been making is that that advantage has been very close to a complete non factor.
I disagree with that, and in fact I'd say that shader adoption has been as fast if not faster than hard T&L. If even OpenGL games based around DirectX 7 tech (Quake 3) have managed to get shaders bolted onto them, that speaks volumes about the direction of gaming.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Not really. I don't have time to pick apart every benchmark you've supplied but I'd wager the bulk of the results you're supplying are CPU limited and/or not shader bound.

Check for yourself, obviously they are not shader bound as shaders have amounted to almost nothing.

Nonsense; it's quite easy to come up with 1-2 dozen games where the R3xx has the advantage, sometimes 2-3 times the speed.

And you can come up with 1-2 dozen games where the NV3x has an advantage, what does that prove?

I disagree with that and in fact I'd say that shader adoption has been as fast if not faster than hard T&L.

Hard T&L had games supporting it before it even hit the market, two years out and it was standard in nearly every shipping title. If we use the most generous definition of what constitutes a shader, then they have been around since DX6(or DX7 if you want to tighten it up a bit)- prior to hard T&L and we still aren't seeing jack. If you want to tighten up the definition of what a shader is to narrow it down then you move in to the miniscule amount of titles that represent what we are seeing today.

If even OpenGL games based around DirectX 7 tech (Quake 3) have managed to get shaders bolted onto themselves that speaks volumes about the direction of gaming.

Why? EMBM, Dot3 and EMCM were all present in DX7 (EMBM was in DX6) and they are all functions handled by shader hardware; numerous titles in the Quake3 era used what can be deemed shaders (you use an extremely loose definition of the word anyway). Saying games will eventually utilize shaders heavily is a given (a point I've never disputed); it's on the fact that that is still a long time away that we don't seem to come to the same conclusion.
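(For anyone who hasn't run into the term: Dot3 bump mapping is basically a per-pixel dot product of a normal-map texel against a light vector, clamped at zero, which is exactly the kind of work pixel shaders later absorbed. A rough Python sketch of the idea follows; the helper name and texel values are my own illustration, not any game's or driver's code.)

```python
# Rough sketch of the fixed-function Dot3 operation: per-pixel lighting
# from a normal-map texel, the sort of math later folded into pixel shaders.
def dot3_lighting(texel_rgb, light_dir):
    """Decode a normal-map texel (0..255 per channel) and return max(0, N.L)."""
    # Expand each channel from [0, 255] to a component in [-1, 1].
    normal = [c / 127.5 - 1.0 for c in texel_rgb]
    length = sum(c * c for c in normal) ** 0.5 or 1.0
    normal = [c / length for c in normal]          # renormalize
    n_dot_l = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, n_dot_l)                       # clamp, as Dot3 hardware did

# A texel whose normal points mostly "up" in tangent space, lit from above.
print(dot3_lighting((128, 128, 255), (0.0, 0.0, 1.0)))  # close to 1.0 (fully lit)
```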
 

imported_Locutus

Junior Member
Oct 22, 2004
5
0
0
LocutusX,

I've been known as Locutus in some places and as Locutus0 or Locutus_of_Borg in most places so no problems there, buddy! ;)
 

imported_Locutus

Junior Member
Oct 22, 2004
5
0
0
Parkbench,
I'm gonna have to agree with you on that one.
I know my first post seems to suggest otherwise, but when I said "near future" I meant literally the next line of video cards. After ATI has adjusted to the larger die caused by the introduction of FP32 and 128-bit color, I believe they will once again be as competitive as they were before the GF 6xxx line was released. After all, NVidia was only competitive for a very short time before finally buying out 3dfx, that short time being roughly from a year after the Voodoo2 up to the Voodoo4 release. And after buying 3dfx they simply had no competition. That didn't mean they were any good at what they did; nobody else had anything intended to compete in 3D graphics yet.

---Edit---
Rewording of Voodoo2 statement== Reason: It was confusing.
==Previous Wording==
After all, NVidia was only competitive for a very short time before finally buying out 3dfx, namely during the fall off of Voodoo2, and then they simply had no competition yet.
 

imported_Locutus

Junior Member
Oct 22, 2004
5
0
0
I noticed earlier a post saying that a human being cannot tell the difference between 96-bit color and 128-bit color. I'm afraid I'll have to disagree on that one, and I have evidence.

I was working on getting my computer to play DVD movies on my HDTV now that I have it connected, and I noticed that my PS2 looked better than the DVD software that came with my video card. At first I didn't understand why, but then the DVD that was playing changed to a much darker scene. I realized that there were telltale divisions in the shadows where one color abruptly changed to another. I thought, "Aha! 16-bit rendering!" So I figured I'd see what I could do about getting some free or open source VOB filters online and see if they looked better. You would not believe the improvement.

But, wait! There's more!

After a while, I began to realize that, aside from the usual progressive-to-interlace losses, it still didn't look quite right. When it was an outside or brightly lit scene it looked beautiful, but in dimmer light or indoors it didn't look quite right. That was when I noticed the same occurrence. Those same telltale divisions were there; they were just much better blended, such that rather than 3 distinct divisions, there were 15 difficult-to-distinguish divisions. Comparison to the PS2 (or other hardware 32-bit DVD player) revealed that the PS2 still looked better.

So here is the breakdown:
16-bit = poor blending
24-bit = good blending
32-bit = superb blending, if not as good as it gets
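To make that breakdown concrete, here's a tiny Python sketch (my own illustration, not anything a DVD player actually runs) of why fewer bits per channel means more visible divisions: quantizing a dim gradient to 5 bits per channel (typical of 16-bit color) leaves only a handful of distinct shades, while 8 bits per channel (24-bit color) leaves a couple dozen.

```python
# Illustrative only: count how many distinct shades survive in a dim gradient
# after quantizing to a given number of bits per color channel.
def quantize(value: float, bits: int) -> float:
    """Snap a 0..1 intensity to the nearest of 2**bits representable levels."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

# A dark ramp from 0% to 10% brightness, the kind of scene where banding shows.
ramp = [i / 1000 for i in range(101)]

for bits, label in [(5, "16-bit color (~5 bits/channel)"),
                    (8, "24-bit color (8 bits/channel)")]:
    shades = {quantize(v, bits) for v in ramp}
    print(f"{label}: {len(shades)} distinct shades in the dark ramp")
# 5 bits/channel leaves only a handful of shades (the abrupt divisions above);
# 8 bits/channel leaves a couple dozen, so each step is far less visible.
```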

On a side note:
This may explain the haze that NVidia users complain about when seeing the same game played on an ATI card. Because ATI cards are only using 96-bit color, shader calculation (rounding) errors and the like, while infrequent even at worst, occur more often than on NVidia cards, which use 128-bit color. These errors have a tendency to "disappear" at certain gamma and brightness settings which are not default on NVidia hardware.

Disclaimer:
No statement made in this post is intended to suggest that either NVidia or ATI hardware is better than the other. The only intent was to introduce both the reasoning behind 128bit and a possible reason behind the "ATI haze" issue.

---Edit---
I have used 128-bit interchangeably with 32-bit, and 24-bit interchangeably with 96-bit, because each of the components in the larger is the size of the smaller: 32-bit red, green, blue, and alpha values, when put together, form 128 bits. However, that is a bit misleading, as it is all displayed to the screen in only as many shades as the screen can actually render.

Disclaimer 2:
These errors have a tendency to "disappear" at certain gamma and brightness settings which are not default on NVidia hardware.

It's a digital signals thing that, to be quite honest, I don't completely understand. I made a D- in "Digital Signals with Multivariable Differential Equations" and I don't honestly intend to have horrific flashbacks to that class.