ATi to demo DX11 chip tomorrow in Computex


Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: BenSkywalker
I used AF on the Ti all the time, AF first, AA if I had performance available. Certainly it was much slower than the R9500Pro. I actually traded that board off as the drivers and IQ were so poor; that's how I got the Ti. But by the time the R9800Pro was hitting, the Ti was just far too slow to run current games with AF at all anymore.

Yea, I guess that was the point. With no AA/AF, the Ti was okay performance-wise, compared to the 9700/9800... But as soon as you turned AA/AF on, the performance dropped greatly. R300 didn't have that problem, so it could be used on current games, not just as a feature to improve the quality of ancient games.

In fact, before R300, I rarely even saw AA/AF options offered in a game's graphics settings. The only way to use it was to force it through the control panel. The first card with AA/AF that I had was a GeForce2 GTS, I believe... but I was happy enough to get trilinear texture filtering on that thing at the time... That was the only image-enhancing feature that was fast enough to use in actual games. Then I had a Radeon 8500... Don't think I ever used AA/AF there either, just wasn't fast enough. Then on the 9600XT it suddenly was very cheap to turn all the features on.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: Scali

Since I use the subjective qualifier 'much' in my comment, the accuracy of the comment cannot be determined in the first place.
If the accuracy of the comment cannot be determined, why did you state it as fact? Additionally, I can only assume we're speaking English here?

http://dictionary.reference.com/browse/much

-adjective
1. great in quantity, measure, or degree:
The examples I gave improved image quality to a great degree, especially AAA in OpenGL. So again, going by English dictionary definitions, you're wrong.

Now, if you choose not to use English definitions as the basis of your comments, then you need to give us prior warning.

Other than that, they added extra AA modes, which required more processing, and as such were not very practical.
How much processing they require is irrelevant to your original false claim that AA didn't improve much.

And actually they're very practical given I used them a lot when I gamed on the 4850. Additionally, I've thoroughly tested and benchmarked them in dozens of titles.

How much gaming have you done with those modes, Scali?
How much benchmarking/testing have you done with those modes, Scali?

You then pull the AA out of context and go on some off-topic egotrip AGAIN.
What context? You stated they hadn't improved AF/AA much, and I pointed out AA had improved it a lot. So please, explain to us this mysterious missing context of your comment.

When you say "context" I think you actually mean "I've been caught out stating inaccurate statements, so now I'm back-pedaling and playing semantic games".

Why don't you look at graphics card marketshare? Hint: GMA comes ahead by a long shot. So again, if our discussions are based on what most people do, most people use GMA.

I'm talking about the fact that early hardware took a bruteforce approach.
But the 16xAF parts didn't, so your statement is the opposite of fact. Early 16xAF parts did even less work than the R3xx, which in turn did less than today's ATi parts. So your statement "if you used 16xAF, it actually took 16 samples for every pixel" is woefully inaccurate.

You mean that in practical implementations it is usually 64 or 128 TAPS, which is not exactly a given, it's hardware/driver dependent. It certainly doesn't MEAN that it will do 64/128 taps.
Like I said, "minus any optimizations for a given surface".

16x MEANS that your texels may have a max anisotropic level of 16x after being projected on screen, namely the width:height ratio is 16:1 (or 1:16, depending on the orientation).
Err, you didn't state any of that. You stated:

The algorithm on early AF hardware was just bruteforce, so if you used 16xAF, it actually took 16 samples for every pixel. Sure, it gives perfect results, but it's not a practical solution because it takes far too much bandwidth.
You stated it used 16 samples on every pixel, which is wrong on both counts. Your comments are right there in bold. Are you now denying you said it?

16x isn't even a sample count, it's a ratio. Yet you stated it was the sample count. Again it's right there in bold. Are you denying the contents of the quote?

Another total back-pedal on your part, yet again.

If you want to try and look smart, at least get your facts straight. Now you just look like an idiot trying to correct someone with the WRONG info, LMAO.
Heh.

"GF3 didn't have MSAA".
-Scali, 2009.

I got a good laugh from that one. Thanks for that. ;)
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: BFG10K
If the accuracy of the comment cannot be determined, why did you state it as fact?

If I say "Today it's much warmer than yesterday", it's accurate as long as temperature(today) > temperature(yesterday), since 'much' can mean anything from 0 and up. "great in quantity", yea, but what is "great", and what is the "quantity"? 'much' is subjective.
Now you're trying to argue that I can only say that if temperature(today) > (temperature(yesterday) + some arbitrary value).
Just because YOUR 'some arbitrary value' doesn't meet my perception of 'much' doesn't mean that you can argue about 'facts'. It's still subjective.

Originally posted by: BFG10K
The examples I gave improved image quality to a great degree, especially AAA in OpenGL. So again, going by English dictionary definitions, you're wrong.

Again, very subjective. What is a "great degree"? What is the context we're talking about here?
I believe it was Idontcare who recently said that you have to specify such a context before you can argue the actual point. If you can't agree on the context and the point that is being argued, then it's not possible to have an argument in the first place.

Originally posted by: BFG10K
How much processing they require is irrelevant to your original false claim that AA didn't improve much.

Again, it depends on the context.
I was talking about improvements on the existing modes.

Originally posted by: BFG10K
What context? You stated they hadn't improved AF/AA much, and I pointed out AA had improved it a lot. So please, explain to us this mysterious missing context of your comment.

I took AA/AF as a single combined entity.

Originally posted by: BFG10K
When you say "context" I think you actually mean "I've been caught out stating inaccurate statements, so now I'm back-pedaling and playing semantic games".

You're the one playing semantic games, trying to turn a subjective observation into an argument about 'facts'.

Originally posted by: BFG10K
Why don't you look at graphics card marketshare? Hint: GMA comes ahead by a long shot. So again, if our discussions are based on what most people do, most people use GMA.

Again, I don't see why you're being argumentative over things like these.

Originally posted by: BFG10K
Like I said, "minus any optimizations for a given surface".

I think you missed my point.
There is no direct relation between 16x max anisotropy and the hardware performing 64 or 128 taps, optimizations or no optimizations.
In no way does setting max anisotropy to 16 in D3D or OpenGL dictate anything to the hardware related to the number of taps. It just tells the driver the maximum amount of anisotropy it is expected to compensate for. This can be done in a number of ways with any number of taps and prefiltered textures (MIP maps, RIP maps etc). That is not specified by the API, and is up to the IHV.
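To illustrate the distinction, here is a toy Python sketch (not any real driver's algorithm; the function name and the probe-count heuristic are invented for this example). The API's max-anisotropy setting only caps the footprint ratio, while the number of taps/probes per pixel is entirely the implementation's choice:

```python
import math

def aniso_probes(dudx, dvdx, dudy, dvdy, max_aniso=16):
    """Estimate a filtering probe count for one pixel (illustrative only).

    The footprint axes are the texture-space derivatives of the pixel.
    max_aniso (the API setting) only caps the ratio; how many probes
    to take is this hypothetical implementation's own decision.
    """
    # Lengths of the pixel's footprint along the two screen axes.
    a = math.hypot(dudx, dvdx)
    b = math.hypot(dudy, dvdy)
    major, minor = max(a, b), max(min(a, b), 1e-6)
    ratio = min(major / minor, max_aniso)  # API setting caps the ratio only
    # One trilinear probe per unit of anisotropy, rounded up.
    return max(1, math.ceil(ratio))

# A nearly isotropic footprint needs few probes even at "16x AF":
print(aniso_probes(1.0, 0.0, 0.0, 1.2))   # -> 2
# A heavily stretched footprint hits the cap:
print(aniso_probes(8.0, 0.0, 0.0, 0.25))  # -> 16
```

Note how the requested 16x never forces 16 probes: an isotropic surface costs almost nothing regardless of the slider setting.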

Originally posted by: BFG10K
"GF3 didn't have MSAA".
-Scali, 2009.

I got a good laugh from that one. Thanks for that. ;)

I'm only human, I never claimed to be infallible. I admitted my mistake already. I don't see why you don't just move on. You seem to have an ego-problem.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: BFG10K

" especially the newer ones that exhibit shader aliasing."


" especially if the game has a lot of shader or texture aliasing".

Can AF help with Shader Aliasing?

Can MSAA or SSAA help with Shader Aliasing?

Shader aliasing has been occurring in games for ages, and yet the only solution found was to use algorithms that apply anti-aliasing to the shaders only, which is a bit costly on the hardware side, especially for DX9. If you know more about the answers to the questions above, enlighten me :) I like to learn.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: evolucion8
Can AF help with Shader Aliasing?

Yes and no.
That is, technically any kind of texture aliasing will show after processing with shaders, so AF filtering textures will reduce aliasing problems.
But that is generally not what people refer to with 'shader aliasing', so in that sense: no.

Originally posted by: evolucion8
Can MSAA or SSAA help with Shader Aliasing?

SSAA can, MSAA cannot.
MSAA was invented specifically to only have to run a shader once for all subpixels, saving valuable bandwidth and processing resources. This works fine when you can catch all the aliasing with texture filtering. But that is not always the case with per-pixel lighting (e.g. bumpmapping and such).
So if you go back to running the shader for all subpixels and filtering the results, you're effectively doing SSAA again, but you will be taking care of the aliasing caused by the shaders. It's expensive, but eventually we will probably be going back to this solution, as it's the best way to handle this type of aliasing.
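As a toy illustration of that difference (an invented Python example, not how a GPU actually schedules shader work): a hard specular-style cutoff shaded once per pixel jumps straight from 0 to 1 at the edge, while running the "shader" for four subsamples and averaging, SSAA-style, recovers an intermediate value there:

```python
def shade(u):
    # Toy "shader": a hard specular cutoff, a classic source of
    # shader aliasing that texture filtering cannot catch.
    return 1.0 if u > 0.4 else 0.0

def pixel_value(x, width, subsamples):
    # subsamples=1 behaves like MSAA (shader runs once per pixel);
    # subsamples>1 behaves like SSAA (shader runs per subsample, averaged).
    return sum(shade((x + (s + 0.5) / subsamples) / width)
               for s in range(subsamples)) / subsamples

width = 4
# One shader sample per pixel: a hard 0 -> 1 jump at the cutoff.
print([pixel_value(x, width, 1) for x in range(width)])  # -> [0.0, 0.0, 1.0, 1.0]
# Four shader samples per pixel: the edge pixel gets a blended 0.5.
print([pixel_value(x, width, 4) for x in range(width)])  # -> [0.0, 0.5, 1.0, 1.0]
```

The cost side is equally visible: the second pass runs the shader four times per pixel, which is exactly the bandwidth/processing expense described above.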
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
I knew that MSAA wouldn't work, especially since it only targets the Z-buffer. But in the end, once faster cards reach the market, SSAA will make a comeback and be usable again.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: evolucion8
I knew that MSAA wouldn't work, especially since it only targets the Z-buffer. But in the end, once faster cards reach the market, SSAA will make a comeback and be usable again.

Yea, nVidia has been hinting at SSAA for years already.
Hybrid modes are also possible. E.g., instead of 4xMSAA with 4 depth samples and 1 colour sample, you could also do 4 depth samples and 2 colour samples.
This will catch some of the shader aliasing at a relatively small performance penalty. In most cases you probably need more antialiasing at the edges anyway (sharper contrast than within a polygon, generally, so more obvious aliasing).
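A back-of-the-envelope sketch of the storage side of that trade-off (illustrative Python; real hardware compresses samples, so treat these as upper bounds, and the function name is made up for this example):

```python
def framebuffer_bytes(width, height, depth_samples, colour_samples,
                      depth_bpp=4, colour_bpp=4):
    """Uncompressed framebuffer footprint for a given AA mode.

    Assumes 32-bit depth and 32-bit colour per sample; illustrative
    only, since real GPUs compress multisample data heavily.
    """
    per_pixel = depth_samples * depth_bpp + colour_samples * colour_bpp
    return width * height * per_pixel

w, h = 1920, 1080
msaa4  = framebuffer_bytes(w, h, 4, 1)  # plain 4x MSAA: 1 colour sample
hybrid = framebuffer_bytes(w, h, 4, 2)  # hybrid mode: 2 colour samples
ssaa4  = framebuffer_bytes(w, h, 4, 4)  # full 4x SSAA: 4 colour samples
print(hybrid / msaa4)  # -> 1.2  (modest increase over plain MSAA)
print(ssaa4 / msaa4)   # -> 1.6
```

Doubling the colour samples only grows the buffer by 20% here, which is why such a hybrid catches some shader aliasing at a relatively small penalty compared to full supersampling.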
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: Scali

I'm only human, I never claimed to be infallible. I admitted my mistake already. I don't see why you don't just move on. You seem to have an ego-problem.
Fair enough then. In the interests of remaining civil, I will drop our current argument.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: BFG10K
Fair enough then. In the interests of remaining civil, I will drop our current argument.

It's a bit too late for that, I'm afraid.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: Scali

It's a bit too late for that, I'm afraid.
I'm not sure what you're trying to achieve with further jabs. I thought you wanted to end this, so I'm happy to initiate cessation. If you want to continue, just say the word.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: evolucion8

Can AF help with Shader Aliasing?
Yes, a good AF scheme reduces shader aliasing; this is why ATi's parts exhibit more shimmer than nVidia's.

But it's not as good as super-sampling. Look at the Doom 3 screenshots I took in the ABT review you linked. Even with the best AF in consumer space, there's still shader aliasing that's substantially reduced by super-sampling.

Can MSAA or SSAA help with Shader Aliasing?
MSAA can't because it only ever takes one shader sample. Super-sampling can because it takes multiple shader samples, either by rendering the shader bigger and scaling down, or by rendering the shader multiple times.

Shader aliasing has been occurring in games for ages, and yet the only solution found was to use algorithms that apply anti-aliasing to the shaders only, which is a bit costly on the hardware side, especially for DX9.
I vaguely remember something about DX10.1 offering the ability to AA shaders only, but it requires developer support. In essence it's like TrAA/AAA, but for shaders.

Presumably the point of it was to be faster and more efficient than brute force SS, possibly by allowing the developer to target specific shaded areas only.

But in the end, once faster cards reach the market, SSAA will make a comeback and be usable again.
It's usable now in slightly older games (many 2005 or older titles on a single GTX285), but unfortunately neither vendor is focusing their efforts on it. But at least nVidia unofficially offers basic OGSS.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: BFG10K
I'm not sure what you're trying to achieve with further jabs. I thought you wanted to end this, so I'm happy to initiate cessation. If you want to continue, just say the word.

Just a simple observation... In order to *remain* civil, you need to have stayed on the civil side all the time, which isn't the case. Hence it's a bit too late to 'remain' civil now.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: BFG10K
I vaguely remember something about DX10.1 offering the ability to AA shaders only, but it requires developer support. In essence it?s like TrAA/AAA, but for shaders.

Presumably the point of it was to be faster and more efficient than brute force SS, possibly by allowing the developer to target specific shaded areas only.

DX10.1 allows you to read back the supersampled depthbuffer, and get the sample positions, and run a pixelshader at sample-frequency rather than pixel-frequency.
It's especially useful because it allows you to use MSAA on deferred rendering.
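In spirit, that lets a deferred renderer do something like the following toy resolve (a hypothetical Python sketch, not actual D3D10.1 API code): shade once where all depth samples in a pixel agree, and only run the shader at sample frequency on edge pixels, then average:

```python
def resolve(depth_samples_buf, shade):
    """Toy per-pixel resolve in the spirit of deferred MSAA under D3D10.1:
    interior pixels are shaded once, edge pixels (where depth samples
    disagree) are shaded per sample and averaged. Illustrative only."""
    out, shader_runs = [], 0
    for samples in depth_samples_buf:        # one list of depths per pixel
        if min(samples) == max(samples):     # interior: one shader invocation
            out.append(shade(samples[0]))
            shader_runs += 1
        else:                                # polygon edge: per-sample shading
            vals = [shade(d) for d in samples]
            out.append(sum(vals) / len(vals))
            shader_runs += len(samples)
    return out, shader_runs

# Three pixels with 4 depth samples each; only the middle one is an edge.
buf = [[1.0] * 4, [1.0, 1.0, 2.0, 2.0], [2.0] * 4]
colours, runs = resolve(buf, lambda d: 1.0 / d)
print(colours)  # -> [1.0, 0.75, 0.5]
print(runs)     # -> 6 (1 + 4 + 1 shader invocations, instead of 12)
```

That per-sample path is exactly what plain MSAA couldn't offer a deferred renderer before, since the lighting shader runs after the G-buffer pass.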
 

DerekWilson

Platinum Member
Feb 10, 2003
2,920
34
81
I was very upset when NVIDIA decided to step backwards and use a lower quality AF method for NV40 because ATI had decided to stick with their approximation ... We pushed them many times to ignore the poor quality ATI chose to implement and to deal with the performance hit, because gamers would appreciate it.

Here's the move from GeForce FX to the GeForce 6 series.

http://www.anandtech.com/showdoc.aspx?i=2023&p=8

And here I mentioned the move back to GeForce FX quality AF in our G80 coverage.

http://www.anandtech.com/video/showdoc.aspx?i=2870&p=14

::EDIT:: -- because I was late to the conversation :)