

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Here are the raw Quake 3 rocket smoke screenshots. I had to convert the others to JPEG to put them on bbzzdd, but I figured it was easy to tell, even after the lossy compression, what 16-bit did. By 16-bit I mean both color depth and texture depth, and likewise 32-bit refers to both. If you meant something else, let me know. All other settings in the Quake 3 video options are all the way up, and I forced on all AA/AF in the ForceWare control panel.

http://home.comcast.net/~asmatte/rocketsmoke_targa.zip

The HUD gets color-banded and textures get grainy and dithered. The sky actually looks like utter **** in 16-bit (hey, it rhymes). It looks like a puddle of mud instead of clouds. However, I couldn't tell the JPEGs from the Targas in a million years, unless perhaps they were flashed one after another ten times. Hey, there are worse things than whining about how much worse 16-bit looks than 32-bit. You see that G70 shimmering thread right next door?
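A minimal sketch of where that banding comes from, assuming nothing more than a 16-bit (5:6:5) framebuffer: each 8-bit channel gets quantized down and expanded back, so a smooth gradient collapses onto a handful of levels. Illustrative only:

```python
def rgb565_roundtrip(r, g, b):
    # Quantize 8:8:8 down to 5:6:5 and expand back, replicating the
    # high bits into the low bits as hardware typically does.
    r5, g6, b5 = r >> 3, g >> 2, b >> 3
    return (r5 << 3 | r5 >> 2, g6 << 2 | g6 >> 4, b5 << 3 | b5 >> 2)

# A smooth 256-step blue gradient (a sky, say) survives with only 32
# distinct values in 16-bit, which shows up as visible bands.
print(len({rgb565_roundtrip(0, 0, b)[2] for b in range(256)}))  # 32
```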

(Shimmering) The theoretical tests may look bad, but when you see it in game you can hardly even tell (unless I have no idea what they are referring to, which may very well be the case). Maybe I am annoyed by it and don't even know it. It affects the 6800 cards too, right? Sorry, I went off on my own little tangent here.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
You could set the LOD bias on nV parts to -3 and get the same effect (actually better in some regards).
So in other words you're claiming ATi didn't really do AF at all, they just adjusted the LOD? And when they adjusted said LOD, nVidia's LOD adjustment was still better, so therefore ATi doesn't count?

I know you like nVidia Ben but your comments are simply ridiculous.
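For anyone following the LOD-bias point, here is a minimal sketch of standard mipmap selection showing why a negative bias sharpens textures at the cost of aliasing, while real AF doesn't have to make that trade. Illustrative values only, not either vendor's actual hardware:

```python
import math

def mip_level(texels_per_pixel, lod_bias=0.0, max_level=10):
    # Standard mipmap selection: level = log2(footprint) + bias,
    # clamped to the available mip chain.
    lam = math.log2(max(texels_per_pixel, 1.0)) + lod_bias
    return max(0.0, min(lam, max_level))

# A surface at a grazing angle covering ~8 texels per pixel:
print(mip_level(8.0))        # 3.0 -> blurry but alias-free plain trilinear
print(mip_level(8.0, -3.0))  # 0.0 -> sharp base level, but ~8x undersampled
# Proper AF instead keeps the finer level and takes extra samples along
# the stretched axis, so it is sharp without the aliasing.
```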

Did you ever try AF out on a R100 part?
Sure, I just recently played HL from start to finish on one. IQ was surprisingly good compared to modern cards and it had far less shimmering than a 6800U with optimizations enabled.

Massive aliasing on a scale far worse than anything we have seen since
This is absolute nonsense, especially when you're basically comparing it to trilinear on an NV1x part, since 2xAF is basically too low to do anything useful. You have a problem with ATi's alleged aliasing with your pixel sensitive eyes yet you don't even bat an eyelid at the hideous blurring nVidia's cards were giving you. Those eyes of yours must be selective pixel eyes, as in nVidia's pixels are good and ATi's pixels are bad.

and extremely disjointed, some directly adjacent textures would go from extremely sharp to extremely blurry.
That did happen, but only at very long distances, and at that time there weren't too many games around where it would show. That's a far cry from every game on the NV1x looking blurry as hell.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
*Shudders* The ATI cards were so pathetically slow back then,
Yet the Radeon was faster than the GF2 at its launch, mainly because it was the first card to support hardware memory bandwidth saving techniques. Only after nVidia's new drivers came out did they regain the crown and even then they were still using hideous 16 bit S3TC to win the likes of Quake 3 benchmarks (another IQ flaw Ben dismisses as "the spec" while lambasting ATi about other things, I might add).

you have to wonder how in the hell anyone could enable *any* anisotropic filtering on their cards.
Because the performance hit was extremely small.

That is what is funny about the post
What's funny about your post is that you don't have a clue what you're talking about.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: BFG10K
*Shudders* The ATI cards were so pathetically slow back then,
Yet the Radeon was faster than the GF2 at its launch, mainly because it was the first card to support hardware memory bandwidth saving techniques. Only after nVidia's new drivers came out did they regain the crown and even then they were still using hideous 16 bit S3TC to win the likes of Quake 3 benchmarks (another IQ flaw Ben dismisses as "the spec" while lambasting ATi about other things, I might add).

you have to wonder how in the hell anyone could enable *any* anisotropic filtering on their cards.
Because the performance hit was extremely small.

That is what is funny about the post
What's funny about your post is that you don't have a clue what you're talking about.

Nice way to edit out what I really wrote and pick it apart. First of all, you will notice that I said ATI cards were pathetic until the Radeon. I never said the Radeon sucked in performance... But now that you are bringing this up, I will say that the Radeon was very late to the market, perhaps a little "too" late.

Second, the 16XAF performance hit is not "incredibly small", and especially not when cards were memory bandwidth starved back then. I am not sure where you are getting the idea that 16XAF takes an "extremely small" performance hit... But you are obviously pro-ATI and Ben is pro-nVidia... Both of you are arrogant extremists, though you even more than Ben, by a large margin.

Third, I do indeed have a clue. Except, unlike you, I do not paint myself in an expert light and talk down to people who do not deserve it. You were simply offended that I mentioned that ATI having 16XAF didn't really matter, because very few people, if any, used it.



 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
First of all, you will notice that I said ATI cards were pathetic until the Radeon.
What gave you the impression anyone was talking about the Rage series? Why even bring it up for that matter?

Second, the 16XAF performance hit is not "incredibly small"
Yes it is.

I am not sure where you are getting the idea
It's called benchmarking.

You were simply offended that I mentioned that ATI having 16XAF didn't really matter
I wasn't offended, your comment was simply nonsensical. I think you're getting confused with nVidia who didn't really get free AF until the FX series three generations later and didn't get 16xAF until the 6xxx series which came four generations later.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: BFG10K
First of all, you will notice that I said ATI cards were pathetic until the Radeon.
What gave you the impression anyone was talking about the Rage series? Why even bring it up for that matter?

Second, the 16XAF performance hit is not "incredibly small"
Yes it is.

I am not sure where you are getting the idea
It's called benchmarking.

You were simply offended that I mentioned that ATI having 16XAF didn't really matter
I wasn't offended, your comment was simply nonsensical. I think you're getting confused with nVidia who didn't really get free AF until the FX series three generations later and didn't get 16xAF until the 6xxx series which came four generations later.

1. I brought it up because ATI didn't have the Radeon until very late. Therefore, the NVx parts were competing against ATI's plentiful Rage cards...

2. I am not sure what you consider small. Look at the 1600X1200 benchmark at this link

You will need to look at the higher-resolution benchmarks to get an idea of the performance hit. If you think a 38.2% - 43.3% performance hit from running 16XAF is "extremely small" then I am not sure there is help for you.

3. See above.

4. See above for this "free" performance hit. Do you always try to sound like an arrogant ass?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
So in other words you're claiming ATi didn't really do AF at all, they just adjusted the LOD?

They certainly weren't doing anything close to real AF. Simply try changing the x, y and z axes at the same time and take a look- insane levels of aliasing.

Sure, I just recently played HL from start to finish on one.

Try anything with some decent textures? You could use a straight LOD bias adjustment with 2x AF enabled on a GF and HL looked fairly decent.

You have a problem with ATi's alleged aliasing with your pixel sensitive eyes yet you don't even bat an eyelid at the hideous blurring nVidia's cards were giving you.

It isn't nV v ATi- it is correct and incorrect. nV's current parts, while still vastly superior to the R100, are grossly inferior to the NV2x parts. Their current parts are incorrect; it doesn't matter if it is ATi or nVidia, nor would it matter if 3dfx had ever actually got a part out that offered AF.

That did happen, but only at very long distances, and at that time there weren't too many games around where it would show. That's a far cry from every game on the NV1x looking blurry as hell.

Fire up Giants with 16x AF on a R100- that was a game from its era (HL certainly wasn't)- or Sacrifice, and check it out. They both suffered from serious issues with aliasing and disjointed filtering.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
1. I brought it up because ATI didn't have the Radeon until very late. Therefore, the NVx parts were competing against ATI's plentiful Rage cards...
What the hell does this have to do with my comments about the Radeon? If you want to discuss the Rage, that's your problem, not mine. If you can't respond on topic to what I posted, then why even bother responding?

I am not sure what you consider small. Look at the 1600X1200 benchmark at this link
Forgot to link the second page in your own article, did we?

In addition, dozens of benchmarks around the web (such as these, these and these) show the performance difference is usually about 5%-15%, which is minuscule compared to the enhanced IQ you get. I've also run dozens of my own benchmarks in a wide range of games and they confirm my statements.

More than likely your results are outliers and/or incorrect, especially based on the clueless commentary that comes after them. If a reviewer has trouble spotting the benefits of AF some serious questions need to be asked as to the competency of said reviewer.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
They certainly weren't doing anything close to real AF.
That's just nonsense Ben and you know it. All it takes is a quick glance at any whitepaper about the card.

Simply try changing the x, y and z axes at the same time and take a look- insane levels of aliasing.
Simply try changing the x, y and z axes at the same time and take a look - insane levels of texture distortion, aka blurriness, from nVidia's trilinear.

Simply try changing the x, y and z axes at the same time and take a look - insane levels of banding and rainbow colours from nVidia's 16-bit S3TC, far worse than any straight 16-bit to 32-bit comparison. To quote yourself on the issue: "I still think of it as utterly hideous- have a hard time seeing how you wouldn't".

If you can't tolerate straight 16-bit colour then you must've been suicidal when you saw the S3TC artifacts.
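To make the S3TC complaint concrete, here is a toy DXT1 palette decode. The `use_16bit_interp` flag models a decoder that keeps the interpolated colours at 16-bit precision, which is one plausible source of the extra banding described above; this is a sketch of the format, not any specific chip's datapath:

```python
def rgb565_to_rgb888(c):
    r, g, b = (c >> 11) & 0x1F, (c >> 5) & 0x3F, c & 0x1F
    return (r << 3 | r >> 2, g << 2 | g >> 4, b << 3 | b >> 2)

def dxt1_palette(c0, c1, use_16bit_interp=False):
    # Each 4x4 DXT1 block stores two RGB565 endpoint colours; the other
    # two palette entries are interpolated 1/3 and 2/3 between them.
    a, b = rgb565_to_rgb888(c0), rgb565_to_rgb888(c1)
    lerp = lambda p, q, t: tuple(round(i + (j - i) * t) for i, j in zip(p, q))
    p2, p3 = lerp(a, b, 1 / 3), lerp(a, b, 2 / 3)
    if use_16bit_interp:
        # Re-quantize the interpolated entries to 16-bit, as a decoder
        # working entirely at 16-bit would: smooth gradients (skies)
        # lose the in-between shades and band.
        pack = lambda c: ((c[0] >> 3) << 11) | ((c[1] >> 2) << 5) | (c[2] >> 3)
        p2, p3 = rgb565_to_rgb888(pack(p2)), rgb565_to_rgb888(pack(p3))
    return [a, b, p2, p3]
```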

Try anything with some decent textures?
Did you read the part where I stated the 6800U with optimizations shimmered more than the Radeon? If the Radeon isn't doing anything close to real AF then by your standards you'd have to concede the 6800U would be doing what...point filtering perhaps?

It isn't nV v ATi- it is correct and incorrect.
Ah, so because trilinear is supposed to make textures distort and blur that's "correct", therefore it's fine?
Is nVidia's 16 bit S3TC also "correct"?

If so how can you criticize 16-bit colour ("I still think of it as utterly hideous- have a hard time seeing how you wouldn't") when it's also "correct"?

Fire up Giants with 16x AF on a R100- that was a game from its era (HL certainly wasn't)- or Sacrifice, and check it out.
Fire up any game on an NV1x and you'll see massive texture distortion pretty much everywhere because 2xAF is next to useless. And no fix for the likes of UT S3TC until you hit the GF4 series three generations later.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: BFG10K
1. I brought it up because ATI didn't have the Radeon until very late. Therefore, the NVx parts were competing against ATI's plentiful Rage cards...
What the hell does this have to do with my comments about the Radeon? If you want to discuss the Rage, that's your problem, not mine. If you can't respond on topic to what I posted, then why even bother responding?

I am not sure what you consider small. Look at the 1600X1200 benchmark at this link
Forgot to link the second page in your own article, did we?

In addition, dozens of benchmarks around the web (such as these, these and these) show the performance difference is usually about 5%-15%, which is minuscule compared to the enhanced IQ you get. I've also run dozens of my own benchmarks in a wide range of games and they confirm my statements.

More than likely your results are outliers and/or incorrect, especially based on the clueless commentary that comes after them. If a reviewer has trouble spotting the benefits of AF some serious questions need to be asked as to the competency of said reviewer.



*shakes head*

Let's just turn to www.anandtech.com for results. But I am sure even you would discredit him. Benchmarks

FPS Without AF = 142.5

FPS With 16XAF = 88.8
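A quick check of the hit those two numbers imply (just arithmetic on the figures quoted above):

```python
no_af, af16 = 142.5, 88.8      # FPS figures from the linked benchmark
hit = (no_af - af16) / no_af
print(f"{hit:.1%}")            # 37.7% slower with 16xAF enabled
```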

I suppose you are going to find a way to *weasel* out of these results, you always do. You change the subject whenever someone proves you wrong, or you discredit them. You are a typical arrogant know-it-all prick. I bet you are going to say "Well that isn't the performance 16XAF" Well, guess what? You are the one talking about IQ, not me. You are the one that said it could do it without a performance hit.

Anand comments with the following (P-AF versus Q-AF):

Immediately you see the performance benefits of ATI's adaptive anisotropic filtering algorithm; granted, it doesn't always make things look perfect but the pros definitely outweigh the cons as you can see by the results above.

In addition to it not always looking the best, this new "adaptive performance AF" didn't even exist in the Radeon 256 that you were talking about. Since it was introduced with the 8500 in a buggy form, you can't even try to take this out of context.

Point is, AF takes more horsepower than you can admit. The only reason you will not see it so much in "today's" games is that we are CPU-limited. But back when the Radeon 256 DDR was out, this was *not* the case. You will also find that my quote said the following:
*Shudders* The ATI cards were so pathetically slow back then, you have to wonder how in the hell anyone could enable *any* anisotropic filtering on their cards. That is what is funny about the post... Yeah, they had the feature, but good luck ever getting use out of it when your hardware can barely run the latest games... ATI majorly sucked back then, until the Radeon series, they were the joke of the market.

They did, and you have a hard time admitting that.

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
I suppose you are going to find a way to *weasel* out of these results, you always do.
Catalyst 2.2, the first drivers ever made for the Radeon 9700 Pro; also the first ATi architecture to support trilinear AF too I might add.

Ever heard of driver updates? Or do you feel the Radeon 9700 Pro's 2002 launch drivers were perfectly optimized for trilinear AF running a brand new title like UT2003?

I also love it how you continually hand-pick results while ignoring the dozens of results that counter your ridiculous assertions. Even better is when results from your own links contradict your comments.

You are a typical arrogant know-it-all prick.
Well aren't you a clever little fellow. We all know you can type, now type something useful.

"Well that isn't the performance 16XAF" Well, guess what? You are the one talking about IQ, not me.
Yes, I am talking about IQ. And? Or more precisely, just what the hell are you talking about? Do you feel 16x performance AF doesn't provide a substantial IQ gain or something?

It's also worth noting that the original Radeon did exactly that: performance (bilinear) AF.

You are the one that said it could do it without a performance hit.
And I stand by the comment because it's correct. Your attempts to disprove me have been quite frankly ludicrous. I'm not even sure what you're trying to disprove.

In addition to it not always looking the best, this new "adaptive performance AF" didn't even exist in the Radeon 256 that you were talking about.
ROFL. Do you have any clue what you're talking about? ATi's AF has always been adaptive in that it didn't sample all surfaces, not to mention their first two implementations only did bilinear AF which gave those early cards the speed boost they needed to run the feature.
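A toy model of what "adaptive" means here, assuming only the general idea (this is not ATi's actual hardware algorithm): the number of texture taps scales with how stretched the pixel footprint is in texture space, so flat-on surfaces cost almost nothing.

```python
def af_taps(du, dv, max_af=16):
    # du, dv: pixel footprint extents along the two texture axes.
    major, minor = max(abs(du), abs(dv)), min(abs(du), abs(dv))
    ratio = major / max(minor, 1e-6)   # anisotropy ratio
    # Adaptive AF: only spend taps where the footprint is elongated.
    return min(max_af, max(1, round(ratio)))

print(af_taps(1.0, 1.0))   # 1  -> screen-facing wall: effectively free
print(af_taps(16.0, 1.0))  # 16 -> grazing-angle floor: full 16x cost
```

Since most pixels in a typical scene are closer to the first case than the second, the average cost stays small, which is the crux of the disagreement above.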

Point is, AF takes more horsepower than you can admit.
The point is I still have no clue as to what you're trying to say and I suspect you don't either.

They did, and you have a hard time admitting that.
Honestly, I'm wondering whether you're simply trolling or are really clueless.

The Rage series couldn't do AF.

Why do you keep bringing those cards up when absolutely nobody else was talking about them?

What the hell does the Rage series have to do with my original claim that the original Radeon could do 16xAF for very little performance hit?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
That's just nonsense Ben and you know it.

Not at all. You can pull up any sort of white paper you want- try it yourself.

Simply try changing the x, y and z axes at the same time and take a look - insane levels of banding and rainbow colours from nVidia's 16-bit S3TC

That is why I would shut it off. You also had the option of running DXT3, which was clearly superior to ATi's DXT1- but besides that, all of them looked very bad. I spent more time trying to get a solution to that problem than anyone else I know of (well, maybe Wumpus). I was going back and forth between id and nV and eventually ended up hacking the Q3.exe to get it to run better for people who were willing to deal with the seriously reduced IQ brought on by S3TC (even when running in the modes that didn't look like complete sh!t).

If you can't tolerate straight 16-bit colour then you must've been suicidal when you saw the S3TC artifacts.

I spent the better part of six months trying to work out a solution, I may still have some of the email exchanges with Carmack kicking around from that actually. I just shut it off.

Did you read the part where I stated the 6800U with optimizations shimmered more than the Radeon?

Try rolling on z with the Radeon- I've never seen anything remotely close to that bad on an AF part.

If the Radeon isn't doing anything close to real AF then by your standards you'd have to concede the 6800U would be doing what...point filtering perhaps?

Let me give you a slight clue- you will find nothing but me blasting nVidia for their filtering for the last several years. Go ahead and check for yourself. You assume that I am remotely close to as biased as you, which I will point out yet again is far removed from reality. In fact, I have stated explicitly that I won't buy any of their latest parts because of their inferior filtering.

Ah, so because trilinear is supposed to make textures distort and blur that's "correct", therefore it's fine?

Roll on z with the R100. It is directly comparable to using a -3 LOD bias with NO AF at all. Am I saying that no AF looked better? Absolutely.

If so how can you criticize 16-bit colour ("I still think of it as utterly hideous- have a hard time seeing how you wouldn't") when it's also "correct"?

16bit isn't correct for anything remotely recent. When you have to truncate color data before you can rasterize it, that is wrong- this is the reason why we now have 128-bit color.

Fire up any game on an NV1x and you'll see massive texture distortion pretty much everywhere because 2xAF is next to useless.

And nothing resembling the eye-gouging aliasing exhibited by the R100. Get away from FPSs once in a while and you may not sound so ignorant of the massive hardware limitations older parts have.

And no fix for the likes of UT S3TC until you hit the GF4 series three generations later.

And that makes ATi's hardware errors OK how....?
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: BFG10K
I suppose you are going to find a way to *weasel* out of these results, you always do.
Catalyst 2.2, the first drivers ever made for the Radeon 9700 Pro; also the first ATi architecture to support trilinear AF too I might add.

Ever heard of driver updates? Or do you feel the Radeon 9700 Pro's 2002 launch drivers were perfectly optimized for trilinear AF running a brand new title like UT2003?

I also love it how you continually hand-pick results while ignoring the dozens of results that counter your ridiculous assertions. Even better is when results from your own links contradict your comments.

You are a typical arrogant know-it-all prick.
Well aren't you a clever little fellow. We all know you can type, now type something useful.

"Well that isn't the performance 16XAF" Well, guess what? You are the one talking about IQ, not me.
Yes, I am talking about IQ. And? Or more precisely, just what the hell are you talking about? Do you feel 16x performance AF doesn't provide a substantial IQ gain or something?

It's also worth noting that the original Radeon did exactly that: performance (bilinear) AF.

You are the one that said it could do it without a performance hit.
And I stand by the comment because it's correct. Your attempts to disprove me have been quite frankly ludicrous. I'm not even sure what you're trying to disprove.

In addition to it not always looking the best, this new "adaptive performance AF" didn't even exist in the Radeon 256 that you were talking about.
ROFL. Do you have any clue what you're talking about? ATi's AF has always been adaptive in that it didn't sample all surfaces, not to mention their first two implementations only did bilinear AF which gave those early cards the speed boost they needed to run the feature.

Point is, AF takes more horsepower than you can admit.
The point is I still have no clue as to what you're trying to say and I suspect you don't either.

They did, and you have a hard time admitting that.
Honestly, I'm wondering whether you're simply trolling or are really clueless.

The Rage series couldn't do AF.

Why do you keep bringing those cards up when absolutely nobody else was talking about them?

What the hell does the Rage series have to do with my original claim that the original Radeon could do 16xAF for very little performance hit?




Actually, you said "extremely small" and "free" to describe the performance hit from 16XAF. Your quotes follow:

Because the performance hit was extremely small. - BFG10K

I wasn't offended, your comment was simply nonsensical. I think you're getting confused with nVidia who didn't really get free AF until the FX series three generations later and didn't get 16xAF until the 6xxx series which came four generations later. - BFG10K

You make it plainly obvious that you do not understand how to use basic English words properly. You have no idea what "extreme" or "free" means... Go look them up in a dictionary.

You go on to state the following:

In addition, dozens of benchmarks around the web (such as these, these and these) show the performance difference is usually about 5%-15%, which is minuscule compared to the enhanced IQ you get. I've also run dozens of my own benchmarks in a wide range of games and they confirm my statements.

1. You linked to the same site... Which A) has a horrible graph. B) can be just as biased as the next site. C) You linked to it, which gives it *little* credibility.

2. Why would I trust your benchmarks? Someone as uneducated as yourself claims that a 5-15% loss is "extremely small" or "free". So why would I trust your results? For all I know, you cannot even count to 10.

3. Your 5-15% performance loss is not even accurate based on "Quality" 16XAF. The Performance loss is anywhere from 30-40%.

You are ignorant; go back to the sandbox where you can be "king".

On a side note, I will not be responding to you anymore... Consider yourself /ignored. I did some searching and you are pretty much an ass to everyone, which is why I will not bother to respond to you again.

Say what you want,

/ignore



I found this on the forums
Wow, look at the troll who never sleeps. The 5800 rap is getting pretty old brudda. Get some new material. So tell us BFG, why do you keep visiting these threads? To stop misinformation? I see no misinformation here. Do you? Just go away. Did you read the Rage3d review? Did you find it interesting? No? Or are you just interested is trolling a legitimate topic. Consider yourself ignored.

ROFL! Have not laughed so hard in ages.

 

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net
Where did all of the talk about the Kyro and PowerVR go?

I've owned a few different video cards using the PowerVR Kyro 2 GPU, from the Hercules 3D Prophet 4500 to the Videologic VividXS Kyro 2. I also owned a GeForce MX at the same time, and when using 32-bit, higher resolutions or FSAA the Kyro 2 used to win easily. At least in the UK, the main competition for the Kyro 2 at the time of its release was the GeForce2 MX, and as everyone knows the Kyro 2 wiped the floor with it.

The final (I think) ever Kyro 2 vs. GeForce2 MX400 review was released 01/2004 and featured newer games like Unreal 2, Enclave, Max Payne 2, UT2003 and X2: The Threat. After the Kyro 2 SE and Kyro 3 were cancelled, PowerVR decided to release new drivers which featured enhanced T&L for Direct3D; this fixed a lot of the compatibility problems to do with T&L. It was just software T&L, but it allowed the Kyro 2 to run a lot of the games which previously didn't work at all. PowerVR Series 5 is the GPU most people want to see, and I have it on good authority that it does exist and has SM3.0! :)

Maybe the last Hercules 3D Prophet Kyro 2 review

Various PS3.0 and VS3.0 examples by PowerVR
 

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net
Originally posted by: BFG10K
I heard Kyro 3 offers full SM 3.0 support but more than likely the card itself is just a rumour.

The Kyro 3 (i.e. PowerVR Series 4) was never going to have SM3.0; PowerVR Series 5 is the one with full SM3.0 support, and for various reasons no video cards have been produced so far. :(
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Not at all. You can pull up any sort of white paper you want- try it yourself.
It's certainly unacceptable these days but back then it was revolutionary. I did see certain games have the filter box but in general it was still far better looking than straight trilinear.

That is why I would shut it off.
That's fine, but most reviewers didn't, which meant Quake 3 based benchmarks weren't really accurate, as nVidia usually had around a 10%-15% performance loss, in addition to texture thrashing in actual gameplay (mainly on 32 MB boards). Also, if you shut it off in games like UT and MOHAA they wouldn't load large textures, which made a huge negative impact on IQ.

You also had the option of running DXT3, which was clearly superior to ATi's DXT1
I concede DXT3 did have an edge over ATi's DXT1 but ATi's implementation still looked great as far as texture compression goes. Also nVidia took a performance hit when enabling this mode, though not as high as disabling it.

And DXT3 is not possible in UT because the textures are pre-compressed in DXT1 format.

Let me give you a slight clue- you will find nothing but me blasting nVidia for their filtering for the last several years.
This isn't about blasting nVidia, it's about refusing to accept when ATi could possibly have an advantage in something.

Roll on z with the R100.
I logged about a dozen hours on the card with Descent 3 when I was playing HL and this is one of the exact tests I tried. The card didn't appear to have any major problems with such an operation.

16bit isn't correct for anything remotely recent
That isn't what I mean by "correct". Nevermind, it's not worth pushing the issue.

Get away from FPSs once in a while and you may not sound so ignorant of the massive hardware limitations older parts have.
<shrugs>

What can I say, I evaluate cards mainly based on the games I play. The only other genre that would really benefit from AF would be flight sims, but I can't see the card having major problems with those either, as Descent 3 is one of the best AF stress-tests around.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
You make it plainly obvious that you do not understand how to use basic English words properly.
Reduced to playing semantic games now, are we? If you educated yourself on the topic you'd know exactly what I mean. In this context "free" doesn't mean exactly 0%; what it means is that the performance hit is small enough to justify leaving the feature on at all times for the huge IQ gain it provides.

Which A) has a horrible graph.
Don't read the graph; scroll down to the framerate table and you'll see a row calculating the exact performance hit of AF.

B) can be just as biased as the next site
I didn't claim your site was biased; I merely called into question the competency of a GPU reviewer who doesn't feel AF makes a difference to IQ. Somebody like that sounds like they've never played a game in their life.

Why would I trust your benchmarks?
Trust me, you have far greater issues to worry about than whether to trust me or not.

3. Your 5-15% performance loss is not even accurate based on "Quality" 16XAF. The Performance loss is anywhere from 30-40%.
Based on your graphs of a release board running a release game on release drivers, then yes. But so what? Without even repeating myself about the possible issues of such a scenario, the original Radeon did performance (bilinear) AF as well. Therefore your decision to ignore the performance benchmarks because "you were talking about IQ" (whatever the hell that means) is quite flawed.

On a side note, I will not be responding to you anymore...
Great, one less troll to deal with. The fewer the better.

ROFL! Have not laughed so hard in ages.
I'll bet you can easily laugh harder when you go back and read your comments in this thread.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
This isn't about blasting nVidia, it's about refusing to accept when ATi could possibly have an advantage in something.

Like when? When have I ever argued that the R1xx/Nv1x DXTC1 filtering comparison didn't go to ATi by a landslide?

When did I ever argue that ATi didn't have clearly superior shader performance in the R3x0/NV3x days?

When did I ever argue that ATi had clearly superior AA during that same era?

The difference between me and you is that you fall down and worship everything ATi does no matter how horrific it looks. A good example would be your comments on ATi's performance mode AF on the R3x0 parts- it looks hideously disgusting to any objective person- loaded with shimmering and mip transitions- and you talked about it like it was a godsend. nVidia currently has incredibly poor texture filtering- ATi always has.

I logged about a dozen hours on the card with Descent 3 when I was playing HL and this is one of the exact tests I tried. The card didn't appear to have any major problems with such an operation.

A true fanatic.

The only other genre that would really benefit from AF would be flight sims

3D RTSs/TBSs and racers all see a considerably larger advantage from a quality AF implementation than any of those genres.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
When have I ever argued that the R1xx/Nv1x DXTC1 filtering comparison didn't go to ATi by a landslide?
You produced two arguments: (1) it was the spec and (2) nVidia users were telling you that UT's large textures looked good on the card.

That's great and all, but neither of those addresses the real issue. Also, you always somehow managed to move the argument back to ATi's filtering.

When did I ever argue that ATi didn't have clearly superior shader performance in the R3x0/NV3x days?
I don't want to touch that one with a 20 foot pole. You can refresh your memory by browsing one of the many 30 page discussions on the subject in the past.

When did I ever argue that ATi had clearly superior AA during that same era?
I'll give you that one as I do remember you were making comments about Mafia's power lines looking good.

The difference between me and you is that you fall down and worship everything ATi does no matter how horrific it looks.
No, if ATi has driver issues or inferiorities I'm the first to jump over to Rage3D and complain about it. I just made a few posts yesterday in fact and you can see them in the Catalyst 5.8 thread.

A good example would be your comments on ATi's performance mode AF on the R3x0 parts- it looks hideously disgusting to any objective person-
Wrong, it looks hideously disgusting to you but that's because you've had a filtering vendetta against ATi since the dawn of time.

Having come from your own Messiah the NV25 and using it in actual gaming for 6+ months the IQ difference on the 9700 Pro wasn't significantly worse like you claim it was. However performance was about 3-4 times faster and I went from not being able to use any AF in the bulk of my games to using 16xAF for near-zero performance hit in all of my titles.

A true fanatic.
I assume you're referring to the reduced samples? Well, guess what: every card that does adaptive AF exhibits this issue, including the likes of a 6800U. It's just one of the small prices to pay to get almost-free AF.

And I'm a fanatic? How about the way you cling to your holy grail NV25?

Tell me, what games are you running on your Ti4600 at 8xAF these days, Ben? Even back then 8xAF was only usable in very old games like GLQuake or in incredibly CPU-bound titles like Undying. In anything that even remotely touched the GPU, like Quake 3, even 2xAF could cut the framerate in half. It was so bad nVidia started introducing optimizations in a desperate attempt to improve performance, optimizations which no doubt reduced the Messiah status of the card in your eyes.

The NV25 did have the best filtering around, no doubts there, but let's not get delusional and claim it was even remotely usable. That filtering implementation could be classed as academic or even theoretical.

3D RTSs/TBSs
I can't see how isometric games would benefit from AF. You may have some grounds with Warcraft 3, but that's only because the cinematics tend to pan to a first-person view; actual gameplay is still isometric.

Descent 3 is by far the best stress-test for AF because it allows something no other game allows - the ability to move and look in any direction and any angle at any time.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
You produced two arguments:

That was compared to you saying it was broken. It wasn't broken even if it was inferior to ATi's solution at the time. I never argued that ATi's S3TC filtering wasn't superior or anything remotely resembling that.

I don't want to touch that one with a 20 foot pole. You can refresh your memory by browsing one of the many 30 page discussions on the subject in the past.

My memory is always fresh. I stated that by the time shader heavy games mattered the R3x0 would be far too slow to run them anyway negating it being a major factor in purchasing a R3x0 based part. I was right.

Wrong, it looks hideously disgusting to you but that's because you've had a filtering vendetta against ATi since the dawn of time.

That is because they have always done it wrong. If you were even handed you would be defending the god awful crap nV is calling filtering now- but you aren't.

Having come from your own Messiah the NV25 and using it in actual gaming for 6+ months the IQ difference on the 9700 Pro wasn't significantly worse like you claim it was.

This coming from a guy who was talking about how great the bilinear filtering on the R3x0 was- your comment is predictable and inaccurate. On a mathematical basis, in screenshots and particularly in motion ATi's inferior filtering is extremely visible to anyone. I was flat out shocked at just how bad it was after hearing you talking about it.

I assume you're referring to the reduced samples? Well, guess what: every card that does adaptive AF exhibits this issue, including the likes of a 6800U.

And they all suck. Why do you have such a hard time understanding this? You going to start raving about ATi's temporal AA next? It is akin to the crap they call filtering.

It was so bad nVidia started introducing optimizations in a desperate attempt to improve performance, optimizations which no doubt reduced the Messiah status of the card in your eyes.

They optimized memory access. In terms of it being 'the messiah'- they are the only major player who has ever done it right- this is provable on a mathematical basis. It is a point of fact whether you want to admit it or not.

I can't see how isometric games would benefit from AF.

And what in the world does that have to do with what I stated....?

You may have some grounds with Warcraft 3, but that's only because the cinematics tend to pan to a first-person view; actual gameplay is still isometric.

Do you ever even take a quick glance at genres outside of FPSs? Fire up Rome: Total War as a quick, and popular, example and see how much AF impacts actual gameplay. Isometric.... have you been in a game shop in the last ten years?

Descent 3 is by far the best stress-test for AF because it allows something no other game allows

You seriously need to get to a game shop- there are literally hundreds of games where you can do the same- Descent3 is only known about because it was a bench tool for a while.
 

Elcs

Diamond Member
Apr 27, 2002
6,278
6
81
I think several people need to be banned for trolling blindly.

THIS IS A "KYRO" THREAD. NOT AN ATI VS. NVIDIA THREAD.

Geez guys, flame in an SM 2.0 vs. SM 3.0 thread if you want to get your kicks. Take your petty arguments elsewhere because, quite frankly, I don't give a damn.

Some interesting info, nemesismk2. It's a pity they haven't released a card into the mainstream market; adding more companies to the mix would keep things fun.

I think I might join Vegitto. These forums are getting like THG's reviews, just ridiculous at times.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: BenSkywalker
You produced two arguments:

That was compared to you saying it was broken. It wasn't broken even if it was inferior to ATi's solution at the time. I never argued that ATi's S3TC filtering wasn't superior or anything remotely resembling that.

I don't want to touch that one with a 20 foot pole. You can refresh your memory by browsing one of the many 30 page discussions on the subject in the past.

My memory is always fresh. I stated that by the time shader heavy games mattered the R3x0 would be far too slow to run them anyway negating it being a major factor in purchasing a R3x0 based part. I was right.

Wrong, it looks hideously disgusting to you but that's because you've had a filtering vendetta against ATi since the dawn of time.

That is because they have always done it wrong. If you were even handed you would be defending the god awful crap nV is calling filtering now- but you aren't.

Having come from your own Messiah the NV25 and using it in actual gaming for 6+ months the IQ difference on the 9700 Pro wasn't significantly worse like you claim it was.

This coming from a guy who was talking about how great the bilinear filtering on the R3x0 was- your comment is predictable and inaccurate. On a mathematical basis, in screenshots and particularly in motion ATi's inferior filtering is extremely visible to anyone. I was flat out shocked at just how bad it was after hearing you talking about it.

I assume you're referring to the reduced samples? Well, guess what: every card that does adaptive AF exhibits this issue, including the likes of a 6800U.

And they all suck. Why do you have such a hard time understanding this? You going to start raving about ATi's temporal AA next? It is akin to the crap they call filtering.

It was so bad nVidia started introducing optimizations in a desperate attempt to improve performance, optimizations which no doubt reduced the Messiah status of the card in your eyes.

They optimized memory access. In terms of it being 'the messiah'- they are the only major player who has ever done it right- this is provable on a mathematical basis. It is a point of fact whether you want to admit it or not.

I can't see how isometric games would benefit from AF.

And what in the world does that have to do with what I stated....?

You may have some grounds with Warcraft 3, but that's only because the cinematics tend to pan to a first-person view; actual gameplay is still isometric.

Do you ever even take a quick glance at genres outside of FPSs? Fire up Rome: Total War as a quick, and popular, example and see how much AF impacts actual gameplay. Isometric.... have you been in a game shop in the last ten years?

Descent 3 is by far the best stress-test for AF because it allows something no other game allows

You seriously need to get to a game shop- there are literally hundreds of games where you can do the same- Descent3 is only known about because it was a bench tool for a while.

/pwned

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
It wasn't broken even if it was inferior to ATi's solution at the time
Your rationale for why it wasn't broken was that it met the spec. Okay then, but you then claim ATi's AF is broken even though AF has no spec.

I stated that by the time shader heavy games mattered the R3x0 would be far too slow to run them anyway negating it being a major factor in purchasing a R3x0 based part. I was right.
That part is up for discussion (e.g. Far Cry being playable at 1024x768 on R3xx but not on FX unless SM 1.1 was used), but in any case it wasn't the main point of your argument. You basically claimed nVidia's shaders didn't really have a problem, but that the likes of Gabe and FutureMark were involved in a giant conspiracy to make them look bad.

If you were even handed you would be defending the god awful crap nV is calling filtering now- but you aren't.
Sorry? Where have I ever bashed nVidia for optimizing their filtering? The only thing I've really said is that they tend to shimmer more when all optimizations are enabled, but of course I was quick to point out you can disable them (something I praise them for, actually). I've never had a problem with optimized filtering unless it's done in a sneaky or covert fashion (e.g. detecting UT2003.exe or detecting specific benchmarks).

On a mathematical basis, in screenshots and particularly in motion ATi's inferior filtering is extremely visible to anyone.
Sure, and on mathematical basis pre-rendering is superior to realtime rendering so are you going to stop playing games and just watch movies and cinematics instead?

And again I'll ask: which games are you playing right now with nVidia's "gift-from-the-gods" 8xAF? Tell me, if this scheme is so viable, why has it gone the way of the Dodo?

And they all suck. Why do you have such a hard time understanding this?
That doesn't leave you with a lot of options then, huh? Either disable AF and go back to 1999 rendering quality, or stop playing games completely.

You going to start raving about ATi's temporal AA next? It is akin to the crap they call filtering.
I challenge you to find any post where I praised this feature. I've slammed it multiple times and requested the likes of SSAA instead. T(emporal)AA is nothing more than a useless gimmick.

They optimized memory access.
Hahahaha, that's a nice spin you put on it. I call it "reducing/disabling texture samples depending on the filtering stage", because that's exactly what they were doing, and incidentally adding some shimmer depending on the settings you used. I was one of the first to start testing Unwinder's patch scripts, checking for both IQ and performance differences, when nVidia first exposed the functionality through their drivers.
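A sketch of the kind of optimization being described, often nicknamed "brilinear": compress the trilinear blend into a narrow band around the mip transition so most pixels need only one mip fetch. Illustrative only, not nVidia's actual driver logic:

```python
def blend_weight(frac, band=1.0):
    # frac: fractional distance between two adjacent mip levels (0..1).
    # band = 1.0 reproduces full trilinear (blend across the whole range);
    # band < 1.0 snaps to pure bilinear outside a window around the
    # transition, skipping the second mip fetch for most pixels.
    lo = 0.5 - band / 2
    return min(1.0, max(0.0, (frac - lo) / band))

print(blend_weight(0.30))       # 0.30 -> full trilinear blends everywhere
print(blend_weight(0.30, 0.2))  # 0.0  -> "brilinear" skips the blend here
```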

there are literally hundreds of games where you can do the same- Descent3 is only known about because it was a bench tool for a while.
That's just utter nonsense. Name one other game where you can do something like using the left strafe key to go up (i.e. towards the roof/sky). Or how about rotating about x, y and z at the same time while moving backwards and upwards at a 45-degree angle?

All child's play in the Descent series but basically impossible in any other game. Hell, most free-flight games like flight sims can't even do something as simple as flying backwards on the z axis.
 

Deeko

Lifer
Jun 16, 2000
30,213
12
81
This thread really makes me happy. Seeing all the old-timers arguing about the old cards brings back fond memories of me and like three other people diligently defending the Voodoo5 against the hordes of GTS fans.