I'm sorry, Nvidia.


Regs

Lifer
Aug 9, 2002
16,666
21
81
Originally posted by: keysplayr2003
Guys, reality check please. None of this truly matters. Really, it doesn't.

This is ATOT; no need to play middleman or peacekeeper.

*stabs keysplayr

;)
 

Rogodin2

Banned
Jul 2, 2003
3,219
0
0
Ha Nostri told me Nvidia purchased the 5th diamond simply to develop a more adaptive AF than ATI's (the footprint changes shape relative to what the algorithm determines is the "best" POV).

I'll believe Ha Nostri rather than a two-bit hack site.

rogo
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
What I'm pissed off about is that plenty of people used to jump on Nvidia for lowering image quality to gain performance. Now that everyone has ATI-based cards, suddenly it's different. Apparently no one considers this cheating, but why not? I think it is cheating: no matter where you go, they gain performance from lowered image quality. It isn't even selective by game; they do it in every game, not just in benchmarks.

The "selective" AF being talked about isn't selective by game but selective by angle: at 16x AF, the card applies different levels of AF depending on the angle of the surface. That's what annoys me. The loss of image quality is still there even if no one notices it. If you compress an image and all the dark gray shades turn into black, that is a loss of image quality; even if the result looks crisper, you have lost detail. That is what ATI is doing: losing detail. Many people have claimed they have superior IQ, but in truth they have lower IQ. And since they do it in bilinear filtering, which is the basis for trilinear and anisotropic filtering, trilinear and anisotropic filtering have lower IQ too.
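(For anyone wondering what "different levels of AF based on certain angles" means mechanically, here is a minimal C sketch of how an angle-dependent heuristic can lose detail. All names and numbers are illustrative, not ATI's actual algorithm. The idea: if the hardware estimates a pixel footprint's elongation from axis-aligned extents instead of its true major and minor axes, a surface rotated toward 45 degrees looks nearly isotropic and gets far fewer samples than the requested 16x.)

    #include <math.h>
    #include <stdio.h>

    /* True degree of anisotropy: ratio of the footprint's major axis to
     * its minor axis, from the screen-space texture-coordinate
     * derivatives (du/dx, dv/dx, du/dy, dv/dy). */
    static float true_aniso(float dudx, float dvdx, float dudy, float dvdy)
    {
        float lx = sqrtf(dudx * dudx + dvdx * dvdx);
        float ly = sqrtf(dudy * dudy + dvdy * dvdy);
        float major = fmaxf(lx, ly), minor = fminf(lx, ly);
        return minor > 0.0f ? major / minor : 1.0f;
    }

    /* Hypothetical angle-dependent estimate: looks only at axis-aligned
     * extents, which underestimates elongation for rotated footprints. */
    static float approx_aniso(float dudx, float dvdx, float dudy, float dvdy)
    {
        float eu = fmaxf(fabsf(dudx), fabsf(dudy));
        float ev = fmaxf(fabsf(dvdx), fabsf(dvdy));
        float major = fmaxf(eu, ev), minor = fminf(eu, ev);
        return minor > 0.0f ? major / minor : 1.0f;
    }

    int main(void)
    {
        /* An 8:1 elongated footprint aligned with the u axis... */
        printf("aligned: true %.1fx, approx %.1fx\n",
               true_aniso(8, 0, 0, 1), approx_aniso(8, 0, 0, 1));
        /* ...and the same footprint rotated 45 degrees. */
        float c = sqrtf(0.5f);
        printf("rotated: true %.1fx, approx %.1fx\n",
               true_aniso(8 * c, 8 * c, -c, c),
               approx_aniso(8 * c, 8 * c, -c, c));
        return 0;
    }

Run it and the aligned footprint reports 8.0x either way, while the rotated one drops from a true 8.0x to an estimated 1.0x, i.e. effectively no anisotropic filtering at all at that angle, no matter what the control panel says.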
 

Evdawg

Senior member
Aug 23, 2003
979
0
0
lalalalala..... it doesn't matter, it's whatever pushes more FPS.... end of story


WHINERS! =P
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Hahaha, look who's changed their mind. And look, you have an ATI card. That's funny.

And by the way, half the tests by Anandtech are Nvidia's territory.
 

Viper96720

Diamond Member
Jul 15, 2002
4,390
0
0
Isn't the selective AF meant to increase performance by not wasting AF on something you won't notice?
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Yeah. Not necessarily, though, since 16x won't even be applied at those angles. If the angle is wrong, 16x never touches that surface at all, so there's no point. It isn't skipping detail you won't notice; it's lowering image quality and getting away with it.

Now that this is known, it means the FX line has better image quality while keeping up with ATI.
 
Apr 17, 2003
37,622
0
76
Originally posted by: VIAN
Yeah. Not necessarily, though, since 16x won't even be applied at those angles. If the angle is wrong, 16x never touches that surface at all, so there's no point. It isn't skipping detail you won't notice; it's lowering image quality and getting away with it.

Now that this is known, it means the FX line has better image quality while keeping up with ATI.

Really? Aren't you forgetting that ATI's AA is superior to Nvidia's?

And what about the bilinear vs. trilinear issue?
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
How could we have been so UNFAIR to poor, innocent nVidia?! I'm sorry too, nVidia, have mercy on my evil soul.....

Waaaaaaaaaaaaaaaaa.....god, I'm so sorry, how could I have doubted you?!
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
AA is fake IQ too, since it tries to emulate a higher resolution. That brilinear filter is ghetto too. You're missing the point: you seem to make more of a fuss about Nvidia than about ATI, and why is that? I don't own either brand, so I have no reason to be a fanboy.

If what ATI is doing is right, then why isn't clipping allowed? Clipping is highly unnoticeable, and so is lower depth precision.
 

Regs

Lifer
Aug 9, 2002
16,666
21
81
Originally posted by: VIAN
AA is fake IQ too, since it tries to emulate a higher resolution. That brilinear filter is ghetto too. You're missing the point: you seem to make more of a fuss about Nvidia than about ATI, and why is that? I don't own either brand, so I have no reason to be a fanboy.

If what ATI is doing is right, then why isn't clipping allowed? Clipping is highly unnoticeable, and so is lower depth precision.

OK, let's get one thing straightened out. Don't peddle innocence just because you may not own either product. It may not match the exact criteria for a "fanboy," but that by no means excuses someone from bias or prejudice. An unbiased standpoint is, in fact, a hard thing to prove; there will always be bias and preconceptions. So please, the next guy who tries to support his theory solely by stating that he is not a "fanboy" should be shot in the head (hypothetically speaking).
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
I agree that there is a bit of prejudice in everyone, but I strive to be as unbiased as possible. I have recommended all brands to people rather than sticking to any one brand.

Anyway, my question was not answered: why is everyone soft on ATI?

If what ATI is doing is right, then why isn't clipping allowed? It's highly unnoticeable, and so is lower depth precision.
 
Apr 17, 2003
37,622
0
76
Originally posted by: VIAN
I agree that there is a bit of prejudice in everyone, but I strive to be as unbiased as possible. I have recommended all brands to people rather than sticking to any one brand.

Anyway, my question was not answered: why is everyone soft on ATI?

If what ATI is doing is right, then why isn't clipping allowed? It's highly unnoticeable, and so is lower depth precision.

Because it's not cheating, it's optimizing.

When Nvidia did it, they did it for one particular program (3DMark). That is cheating.
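(The distinction being drawn here is mechanical: an optimization applies to whatever the card renders, while the 3DMark affair involved the driver recognizing one specific program and changing its behavior only for that program. A hypothetical C fragment of what such detection amounts to; the function name and executable list are made up for illustration, not taken from any real driver:)

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical app-detection check: the driver compares the running
     * executable's name against a built-in list and enables special-case
     * shortcuts only for those titles. Everything else renders normally,
     * which is why the speedup shows up in the benchmark but not in
     * games. */
    static int use_benchmark_shortcuts(const char *exe_name)
    {
        static const char *detected[] = { "3DMark03.exe", "3DMark2001.exe" };
        for (size_t i = 0; i < sizeof(detected) / sizeof(detected[0]); i++)
            if (strcmp(exe_name, detected[i]) == 0)
                return 1;  /* apply app-specific clip planes, shader swaps, etc. */
        return 0;
    }

    int main(void)
    {
        printf("3DMark03.exe -> %d\n", use_benchmark_shortcuts("3DMark03.exe")); /* 1 */
        printf("quake3.exe   -> %d\n", use_benchmark_shortcuts("quake3.exe"));   /* 0 */
        return 0;
    }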
 

Regs

Lifer
Aug 9, 2002
16,666
21
81
Nvidia is simply more aggressive with its optimizations and less forthcoming when it comes to announcing or admitting to them. "Shady" is a good name for it.

This is why people may seem soft on ATI. The end.
 

Jack4KickAss

Member
Oct 17, 2003
64
0
0
Well, all I can say is that since changing from my FX 5600 to my Radeon 9800, the image quality in my games with AF on is a lot better. I hardly noticed any difference on my FX 5600 between 2x and 8x AF.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
So it's allowed. I'm just trying to understand what is allowed and what isn't. Is FPS still king, or is it IQ?

Even if ATI meets the DX9 spec, they are totally ignoring OpenGL. How do you as a gamer feel about that? One of the major APIs, ignored.
 

Regs

Lifer
Aug 9, 2002
16,666
21
81
Originally posted by: VIAN
So it's allowed. I'm just trying to understand what is allowed and what isn't. Is FPS still king, or is it IQ?

IQ is without a doubt very important; the 9700 Pro proved that both frame rate and IQ matter. For a 300-dollar video card to sacrifice IQ to claw back a 25-50% performance deficit is inexplicable. But what's so painfully obvious to one person is imperceptible to another.
 

Viper96720

Diamond Member
Jul 15, 2002
4,390
0
0
If it applies to everything, it's an optimization. If it applies to just one app, it's a cheat.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
I revise my statement that ATI is cheating, because ATI follows the DX9 spec. I also understand that Nvidia was trying to sway buyers to its side by optimizing only benchmarks; that is cheating. But now I see that ATI does not follow the OpenGL spec.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
I'm sorry that I sided temporarily with the enemy when you were accused of cheating. In truth, it was ATI who has always been cheating.


Nobody's cheating: AF doesn't even have a spec under Direct3D or OpenGL, so the way a vendor chooses to implement it is a design decision at the hardware level.

As for the 5-bit issue, the author is conveniently forgetting that ATi can sample up to double the number of texels that nVidia can at their maximum AF settings. I'd say that makes a bigger difference than 5-bit vs. 8-bit filtering, especially since they (by their own admission) needed magnified and unrealistic screenshots to show a difference.
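(For readers following along: trilinear filtering blends two adjacent mip levels by a fractional LOD weight, and the "5 bit vs 8 bit" question concerns the precision of that weight. A rough C sketch of the arithmetic under that assumption; this is a generic illustration, not either vendor's actual pipeline:)

    #include <math.h>
    #include <stdio.h>

    /* Snap a trilinear blend weight in [0,1] to an n-bit fixed-point
     * grid: 5 bits gives 32 possible weights, 8 bits gives 256. Coarser
     * steps can show up as slightly more visible banding between mip
     * levels. */
    static float quantize_weight(float frac, int bits)
    {
        float steps = (float)((1 << bits) - 1);
        return roundf(frac * steps) / steps;
    }

    int main(void)
    {
        float frac = 0.37f;  /* some fractional LOD between mip N and N+1 */
        printf("ideal %.4f  5-bit %.4f  8-bit %.4f\n",
               frac, quantize_weight(frac, 5), quantize_weight(frac, 8));
        return 0;
    }

With 5 bits the weight snaps to one of 32 steps instead of 256 (here 0.3548 vs. 0.3686 against an ideal 0.37), which is exactly the kind of difference that needs magnified screenshots to see.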

How is lowering image quality for more performance not cheating?
Because it's a design decision based on something that has no spec to begin with.

In essence, if ATI did full image quality, they would perform slower than Nvidia-based cards.
Uh, the FX series doesn't do full AF either.

No, this is cheating, and more so than anything Nvidia ever did.
Utter nonsense.

Yes, but that's cheating ;)
No it isn't.

nVidia got flamed heavily by most sites for following the S3TC spec; is this all that different?
Would you say that ATi's design choice has created a totally unusable feature like nVidia's S3TC implementation did?