I'm sorry Nvidia.


VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
ATI cards follow the DX9 spec. OpenGL requires a higher spec.

Here is a Quote from the article:

A trilinear filter is a linear interpolation between two bilinear samples, requiring a weight between 0 and 1. ATI allocate five bits for this weight, which matches Direct3D's reference rasterizer (however, higher precision is allowed by Direct3D and in fact desirable). In OpenGL, SGI - who spearheaded the inception of this API - use eight bits. That's also the standard that's followed by, eg, Nvidia's GeForce range that implements the 8 bit linear interpolation weight for both OpenGL and Direct3D.

This shows that Nvidia has better IQ than ATI.
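For reference, here is a minimal sketch of what that 5-bit vs 8-bit interpolation weight means in practice - this is just an illustration of the numbers quoted above, not either vendor's actual hardware path: the fractional LOD that blends the two bilinear samples gets snapped onto a coarser or finer set of steps.

```python
# Illustration only: quantize the trilinear lerp weight to a given bit depth
# before blending the two bilinear samples from adjacent mip levels.

def quantize_weight(frac_lod: float, bits: int) -> float:
    """Snap a 0..1 blend weight onto (2**bits - 1) discrete steps."""
    steps = (1 << bits) - 1
    return round(frac_lod * steps) / steps

def trilinear(sample_hi: float, sample_lo: float, frac_lod: float, bits: int) -> float:
    """Linear interpolation between two bilinear samples using a quantized weight."""
    w = quantize_weight(frac_lod, bits)
    return sample_hi * (1.0 - w) + sample_lo * w

# A weight of 0.40 lands on noticeably different steps at 5 vs 8 bits:
print(quantize_weight(0.40, 5))  # ~0.387 (only 31 steps between mip levels)
print(quantize_weight(0.40, 8))  # 0.4 exactly (255 steps)
```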
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
when nvidia did it they did it for a particular program (3dmark), this is cheating

Do you honestly believe that? Application-specific optimizations are cheating? I'd like to get people on record on this point, just so they can be reminded of it in the future (I've always thought it was an absurd assertion, dating back to the Voodoo1 days - yes, 3dfx had loads of app-specific optimizations).

Would you say that ATi's design choice has created a totally unusable feature like nVidia's S3TC implementation did?

Using S3TC3 wasn't too complicated, at least I never thought it was. I'm not in here flaming away at ATi - I can't bring myself to be a hypocrite like so many others, I guess. It is within the specs and is inferior to other options, but it is within specs. Sounds awfully familiar. Chalnoth's posts over at B3D bring up some good points; the driver guys over at B3D were adamant that they were selecting LOD bias properly, and this article brings up an issue that could well explain why ATi has so much more texture aliasing than the nV parts.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
How does ATI sample more texels? ATI is 8x1 and Nvidia is 4x2 - an equal number of texture units, only Nvidia is clocked higher. What am I missing?
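For what it's worth, here is the back-of-the-envelope arithmetic behind that point. The clock speeds below are only rough example figures for cards of that generation, not numbers taken from the article:

```python
# Peak bilinear texel throughput = pipelines x TMUs per pipeline x core clock.
# Clock speeds here are rough example values, not exact retail specifications.

def peak_texel_rate_mtexels(pipes: int, tmus_per_pipe: int, clock_mhz: int) -> int:
    """Mtexels/s, assuming each TMU delivers one bilinearly filtered texel per clock."""
    return pipes * tmus_per_pipe * clock_mhz

print(peak_texel_rate_mtexels(8, 1, 380))  # 8x1 design at ~380 MHz -> 3040 Mtexels/s
print(peak_texel_rate_mtexels(4, 2, 450))  # 4x2 design at ~450 MHz -> 3600 Mtexels/s
```

Per clock the two layouts fetch the same number of texels; the difference is the clock speed, which is exactly the point of the question.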
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
To do it specifically for benchmarks means they are trying to falsely win over buyers, because real games then show different performance.
 

Regs

Lifer
Aug 9, 2002
16,666
21
81
Originally posted by: BenSkywalker
when nvidia did it they did it for a particular program (3dmark), this is cheating

Do you honestly believe that? Application-specific optimizations are cheating? I'd like to get people on record on this point, just so they can be reminded of it in the future (I've always thought it was an absurd assertion, dating back to the Voodoo1 days - yes, 3dfx had loads of app-specific optimizations).

So you're saying that the newer drivers from both ATi and nVidia do not override the application preferences?

What kind of writing technique are you using, Ben? Bait and switch? Plant doubt, then go in for the kill. I like it.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
So you're saying that the newer drivers from both ATi and nVidia do not override the application preferences?

They absolutely do, and this is not remotely close to new - it has been done for the past seven years (since the Voodoo1 at least; I'm not sure if the earlier 3D 'accelerators' did it too, but they may have). Pretty much, it has always been done.

What kind of writing technique are you using, Ben? Bait and switch? Plant doubt, then go in for the kill. I like it.

Not at all. PowerVR uses app-specific optimizations for d@mn near every game you've ever heard of, and they have never hidden that fact. If anyone here has a Kyro/Kyro2 in a rig right now (I do - not one of my rigs, but I have it here at the moment anyway), simply go into the registry if you have any doubts and look around.

I want people to either come out and say 'I am a complete hypocrite' or define a clear standard for themselves and stick with it. I have done so numerous times, and it would really cut down a lot on the flaming in these forums. It would be pretty hard for people to flame one company one month for doing something, and then say it's alright the following month when their company is doing it. The fanboys would have no place to hide if they would simply define their standards up front, and we could significantly improve the signal-to-noise ratio around here.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
OpenGL requires a higher spec.
It doesn't say that at all. It says SGI recommends 8 bits.

I also find it rather bizarre that you're so fixated on such a trivial issue - an issue that the writers themselves said was impossible to spot without massive magnification - while ignoring the numerous blatant and visible in-game cheats that nVidia has been employing for many months.

How does ATI sample more texels? ATI is 8x1 and Nvidia is 4x2 - an equal number of texture units, only Nvidia is clocked higher. What am I missing?
Pipeline configuration is irrelevant to the maximum AF setting. ATi's maximum setting can sample up to 128 texels while nVidia's maximum setting can only do 64 texels. I guess that means nVidia is cheating using your logic, huh?
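Those 128 and 64 figures are simple arithmetic, assuming the maximum anisotropy each driver exposed at the time (16x for ATi, 8x for nVidia) and a trilinear kernel of 8 texels per sample:

```python
# Worked example of the texel counts above: maximum anisotropy multiplied by
# the 8 texels of a trilinear sample (2 bilinear taps x 4 texels each).

TEXELS_PER_TRILINEAR_SAMPLE = 2 * 4

def max_af_texels(max_anisotropy: int) -> int:
    return max_anisotropy * TEXELS_PER_TRILINEAR_SAMPLE

print(max_af_texels(16))  # 128 texels at 16x AF (ATi's maximum setting)
print(max_af_texels(8))   # 64 texels at 8x AF (nVidia's maximum setting)
```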
:rolleyes:


(Note: for the record I don't think nVidia is cheating with their 8x - that was their design decision. Also adaptive AF isn't cheating either).
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
I did comment on Nvidia's optimization. My argument has been completely shot down. I will find a standard, I guess, like Ben said.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Using S3TC3 wasn't too complicated, at least I never thought it was.
The complexity of the DXT3 switch is irrelevant. We're comparing the usability of two features that follow a given spec.

I'm not in here flaming away at ATi - I can't bring myself to be a hypocrite like so many others, I guess. It is within the specs and is inferior to other options, but it is within specs.
I never criticized nVidia for following the S3TC spec, I criticized them for producing an unusable feature that was soon disabled/changed when the users saw it, thus in some way artificially inflating the benchmark results. ATi's filtering isn't unusable and in fact overall their image quality tends to be superior to nVidia's in a lot of cases, cases that have been proven many times in a wide range of IQ reviews.

could well explain why ATi has so much more texture aliasing than the nV parts.
I admit that ATi does have a tad more aliasing than nVidia but I also find their AF and AA are superior to nVidia's and also that high resolutions tend to clear the aliasing up quite well.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Who would really care if anyone cheats as long as they can't tell? In that case isn't that just the company doing a good job and bringing you the best performance possible without significantly sacrificing quality to the point where it is completely obvious?

That anyone would get so worked up over such trivial issues is a clear sign of fanboyism at work. An unbiased person wouldn't care either way which card might or might not be cheating if they couldn't tell, as long as the card does what they expect it to - and in some cases more than they expect, which is what we've been given recently. Now we're getting so greedy that we have to have flame wars about cheating to determine which company is the "winner" in order to justify our hefty purchases. You don't see flame wars starting over what NIC or DVD-ROM drive you use, now do you? Too bad video cards don't cost $40 at most, because then we'd never see this garbage to such an extent.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I never criticized nVidia for following the S3TC spec, I criticized them for producing an unusable feature

The sky isn't blue because it is blue :) Because they followed the S3TC spec, you consider it unusable. The only version of the implementation you have seen that you consider usable does not follow the spec - it exceeds it. Therefore, you are directly criticizing nVidia for following the spec. The S3TC spec calls for what nVidia provides, and that's what you get with nV. You get superior results with ATi, which went above and beyond the spec to give a superior image.
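For anyone who missed the original S3TC argument: the usual explanation is that a DXT1 block stores two RGB565 endpoint colours and the decoder derives the two in-between colours from them, and the spec does not require that interpolation to happen at more than 16-bit precision. A decoder that stays in 565 is therefore "within spec" but bands visibly on smooth gradients (the famous sky textures), while expanding to 8 bits per channel first, as described for ATi above, exceeds the spec and looks better. A rough sketch of the difference, not either vendor's actual decoder:

```python
# Rough sketch of decoding one interpolated DXT1 colour at 16-bit vs higher
# internal precision. Endpoints are packed RGB565 values; the spec's in-between
# colour is 2/3 of one endpoint plus 1/3 of the other.

def unpack565(c: int):
    return ((c >> 11) & 0x1F, (c >> 5) & 0x3F, c & 0x1F)

def expand565(c: int):
    r, g, b = unpack565(c)
    return (r * 255 // 31, g * 255 // 63, b * 255 // 31)

def lerp(a, b, num: int, den: int):
    return tuple((x * (den - num) + y * num) // den for x, y in zip(a, b))

def decode_16bit(c0: int, c1: int):
    """Interpolate in 565 space, then expand - coarse steps, visible banding."""
    r, g, b = lerp(unpack565(c0), unpack565(c1), 1, 3)
    return (r * 255 // 31, g * 255 // 63, b * 255 // 31)

def decode_32bit(c0: int, c1: int):
    """Expand to 8 bits per channel first, then interpolate - smoother result."""
    return lerp(expand565(c0), expand565(c1), 1, 3)

c0, c1 = 0x001F, 0x0000      # a pure-blue-to-black gradient, e.g. a sky texture
print(decode_16bit(c0, c1))  # (0, 0, 164)
print(decode_32bit(c0, c1))  # (0, 0, 170)
```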

ATi's filtering isn't unusable

Arguable. I find their filtering introduces more IQ problems than nV's S3TC3 implementation did, for certain. For anyone who would question it: I've been complaining about the filtering on ATi's products for a long time now, and I thought my R300-based board had clearly inferior texture filtering to the NV25. I had been under the assumption that much of this was due to a more aggressive LOD bias setting, but it appears that this may not be the case.

ATi's filtering isn't unusable and in fact overall their image quality tends to be superior to nVidia's in a lot of cases, cases that have been proven many times in a wide range of IQ reviews.

They are almost always comparing the appearance of a more aggressive LOD bias, which could very easily be caused by their inferior filtering showing more aliasing artifacts than their competition's.
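Since LOD bias keeps coming up, here is a toy sketch of how it interacts with mip selection - made-up numbers, not any particular driver: the hardware picks a mip level from roughly log2 of the texel-to-pixel ratio plus the bias, so a negative bias selects sharper mip levels at the cost of more texture aliasing in motion.

```python
import math

# Toy mip-level selection: lambda = log2(texels per pixel) + LOD bias.
# A negative bias pushes toward sharper (lower-numbered) mip levels, which
# looks crisper in screenshots but shimmers/aliases more in motion.

def mip_level(texels_per_pixel: float, lod_bias: float, max_level: int) -> float:
    lam = math.log2(texels_per_pixel) + lod_bias
    return min(max(lam, 0.0), float(max_level))

for bias in (0.0, -0.5, -1.0):
    print(bias, round(mip_level(6.0, bias, 10), 2))
# 0.0 -> 2.58, -0.5 -> 2.08, -1.0 -> 1.58
```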
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Pioneer DVD-ROM drives are better than Samsung.

Weren't there issues with S3TC in some games? How are you saying that it was unusable, when I had to turn it off so I could get a better sky in one of my games?

One of my questions was not answered: because NV has higher-than-normal specs, doesn't that mean that NV has higher quality than ATI? In an article linked at the beginning of the thread, it showed that the NV25 had better filtering than both the NV30 and the R300; the NV30 had smoother filtering than the R300, but the R300 had better transitioning in the MIP maps. Well, better IQ doesn't seem to be the case. What happened to the greatness of the NV25's filtering?
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
A lot of this is me trying to figure out who is better. Who I'm gonna support. I'm not gonna support someone who isn't worthy of my support. Right now, it's very hard to tell. Both have their problems.

I want the graphics card that will show me the game as it was intended, with IQ that isn't lower than what the developer intended to give. Take, for instance, how I customize my V5 PCI drivers.

I play at 1024x768x32.

Global Settings--
AGP Command FIFO - Disabled
Highest Quality Video - Enabled - Comes with negligible aliasing.
Refresh Optimization - Disabled - Better image quality when disabled.
VIA Chipset - Bios default

Direct3D--
2 Pixel-Per-Clock Rendering - Enabled - Questionable. What do you think?
3D Filter Quality - Automatic
Alpha-Blending - Automatic
D3D Guardband Clipping - Disabled
Anti-Aliasing - Fastest Performance - Not enough fill rate for high res or AA at any sample count!
Edge-Aliasing - Disabled - Early version of the above that works on driver v.1 only.
Maximum Buffered Frames - 1 Pending Buffer
MIP Map Dithering - Disabled - Multi-texturing isn't allowed with this enabled - sucks.
Rendering Color-Depth - Software Controlled
Scan-Line Height - 1 Line (Default)
Speed Settings - Vsync off - More frames and better response time.
Trilinear Texture Filtering - Disabled - Multi-texturing isn't allowed with this enabled - sucks.
Z-buffer Optimization - Enabled - No compatibility or IQ loss issues.

OpenGL/Glide--
2 Pixel-Per-Clock Rendering - Enabled - Questionable. What do you think?
3D Filter Quality - Automatic
Alpha-Blending - Automatic
Depth Precision (16 Bit) - Disabled - Lowers IQ and compatibility when enabled.
Depth Precision (32 Bit) - Disabled - Lowers IQ and compatibility when enabled.
Anti-Aliasing - Fastest Performance - Not enough fill rate for high res or AA at any sample count!
Edge-Aliasing - Disabled - Early version of the above that works on driver v.1 only.
Force 16-bit Textures - Disabled
Glide Splash Screen - Enabled - It looks cool.
Hidden Surface Removal - Disabled - Don't remember why, but I know it sucked.
Legacy Texture Compression - Enabled - But doesn't seem to do any good.
LOD Bias - 0 - Best quality LOD comes with extreme texture aliasing.
Maximum Buffered Frames - 1 Pending Buffer
MIP Map Dithering - Disabled - Multi-texturing isn't allowed with this enabled - sucks.
OpenGL Guardband Clipping - Disabled
Rendering Color-Depth - Software Controlled
Screen Capture Hotkey - Disabled
Speed Settings - Vsync off - More frames and better response time.
Triple Buffering - Disabled - Hah, as if I had the Mem to spare.

Performance is there, at high IQ and compatibility. From what I can tell, ATI has the IQ and NV has the compatibility. What a choice.
Any recommendations, besides a good psychiatrist?

Who do you think that Brand is?