ATi to demo DX11 chip tomorrow in Computex

soccerballtux

Dec 30, 2004
Originally posted by: lopri
Originally posted by: BFG10K
I really want to see what this thing brings to the table in terms of AF and AA.
2nd'ed.

This is the only step forward we can get now with graphics yes?
The 4890 can play all the latest games at the highest settings at 1920x1200.

No new games on the horizon to bring it to its knees......
 

Scali

Banned
Dec 3, 2004
Originally posted by: soccerballtux
Originally posted by: lopri
Originally posted by: BFG10K
I really want to see what this thing brings to the table in terms of AF and AA.
2nd'ed.

This is the only step forward we can get now with graphics yes?
The 4890 can play all the latest games at the highest settings at 1920x1200.

No new games on the horizon to bring it to its knees......

I think they're referring to the image quality rather than performance.
ATi hasn't improved its AF/AA quality much since the Radeon X1000-series. nVidia has slightly better image quality (at least, it is closer to the theoretical ideal).
 

ibex333

Diamond Member
Mar 26, 2005
Well, this is all well and good, but why even make DX11 when they never even fully took advantage of DX10?
 
soccerballtux

Dec 30, 2004
Originally posted by: Scali
Originally posted by: soccerballtux
Originally posted by: lopri
Originally posted by: BFG10K
I really want to see what this thing brings to the table in terms of AF and AA.
2nd'ed.

This is the only step forward we can get now with graphics yes?
The 4890 can play all the latest games at the highest settings at 1920x1200.

No new games on the horizon to bring it to its knees......

I think they're referring to the image quality rather than performance.
ATi hasn't improved its AF/AA quality much since the Radeon X1000-series. nVidia has slightly better image quality (at least, it is closer to the theoretical ideal).

Who cares how good the AA is if you can't even see the graphics? Back when I was playing WoW, I took this screenshot. I haven't seen any ATI shots that look like this.

Fast forward to my 8800GT and the colors were NO better. This was after a reformat and complete reinstall, different driver versions, no color profile tweaking, etc. Simply horrible.
My next card will be ATI; I only grabbed the 8800GT/9800GT because of a rebate + CoD4 + a great price.

Don't play wow anymore either, but still, ATi has always had the better IQ from everything I've seen. You'll have to do more; please give links. There are enough paid Nvidia forum folks that we can't just take any person's word anymore...
So yeah, if you have any IQ comparisons that would be great.
 

Sylvanas

Diamond Member
Jan 20, 2004
Originally posted by: ibex333
Well, this is all well and good, but why even make DX11 when they never even fully took advantage of DX10?

Sure they have. DX11 is easier to develop for and brings with it some very important new features. ATI has even said that their current DX10.1 cards will likely run faster just by having DX11 and the appropriate drivers installed, thanks to compute shaders and multithreaded drivers, features lacking in DX10. So there is a benefit even if you are not running DX11 games.
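For anyone wondering what the multithreaded-driver part means in practice, here is a minimal sketch of D3D11's deferred-context path (an illustration only, with error handling trimmed; how much of it a given driver actually runs in parallel is up to the driver):

#include <d3d11.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Record work on a deferred context (safe to do on a worker thread),
// then play the resulting command list back on the immediate context.
void RecordAndSubmit(ID3D11Device* device, ID3D11DeviceContext* immediate)
{
    ComPtr<ID3D11DeviceContext> deferred;
    if (FAILED(device->CreateDeferredContext(0, &deferred)))
        return;

    // ... set state and issue draw calls on `deferred` here ...

    ComPtr<ID3D11CommandList> commandList;
    if (SUCCEEDED(deferred->FinishCommandList(FALSE, &commandList)))
    {
        // Playback on the immediate context; FALSE = don't restore its state.
        immediate->ExecuteCommandList(commandList.Get(), FALSE);
    }
}

Compute shaders follow a similar story: the DX11 runtime exposes downlevel compute (CS 4.x) on DX10-class hardware through driver support, which is presumably what the "faster just by having DX11 installed" claim refers to.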
 

BFG10K

Lifer
Aug 14, 2000
Originally posted by: soccerballtux

This is the only step forward we can get now with graphics yes?
It's not the only step, but it's definitely an important one, especially given ATi's lackluster AF.

The 4890 can play all the latest games at the highest settings at 1920x1200.
It can play them, but are they playable? Not the likes of Crysis, Stalker Clear Sky, or Cryostasis, that's for sure.

No new games on the horizon to bring it to its knees......
Those games are already here.

Who cares how good the AA is if you can't even see the graphics? Back when I was playing WoW, I took this screenshot. I haven't seen any ATI shots that look like this.
I'm not sure what you're saying exactly. That because you can't see the graphics in WoW, that makes AA & AF useless? How about just raising the brightness of the game?

Don't play wow anymore either, but still, ATi has always had the better IQ from everything I've seen.
They're slightly better with edge AA, and offer AAA in OpenGL.

But nVidia has much better AF, has full scene super-sampling, and has more accurate TrAA in Direct3D.

As a result, nVidia is better overall.

You'll have to do more; please give links.
http://forums.anandtech.com/me...id=31&threadid=2276048

So yeah, if you have any IQ comparisons that would be great.
Done. :)
 

BFG10K

Lifer
Aug 14, 2000
Originally posted by: Scali

ATi hasn't improved its AF/AA quality much since the Radeon X1000-series.
That's false; the 2000 series introduced edge-detect modes and 8xMSAA, and OpenGL AAA arrived a bit later.

nVidia has slightly better image quality (at least, it is closer to the theoretical ideal).
Actually nVidia's IQ advantage is quite large with AF and super-sampling.
 

evolucion8

Platinum Member
Jun 17, 2005
While nVidia theoretically has the best implementation of AF on the market, the visual differences in games are almost 100% impossible to find unless you are looking with a microscope or carefully examining and zooming in on screenshots of the game. Anti-aliasing overall is a wash, depending on the scenario. TrAA accuracy is a matter of taste: some may like ATi's sharper look, others would like nVidia's soft edges; I'd prefer nVidia's softer edges. ATi has the better default color profile and that's why they look so vivid, but with some tweaks nVidia should match it with no problems.

ATi's edge detect is currently the best anti-aliasing implementation and offers notable differences in image quality compared to other modes, but its usability is questionable, like nVidia's super-sampling modes, since both are pretty demanding performance-wise. I played Velvet Assassin with the narrow and wide tent filters and the blurriness is very hard to spot, but in Crysis it looked so blurry that I actually thought I had myopia or bleach poured in my eyes. While nVidia is behind with AAA support in OpenGL, I don't see the rush considering how few OpenGL games are currently on the market.
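For reference, AMD's actual CFAA filter weights and sample positions aren't public, but a generic tent-filter resolve, the rough idea behind the narrow/wide tent modes, looks something like the sketch below: samples from neighbouring pixels are blended in with a weight that falls off linearly with distance from the pixel centre, which is where both the extra smoothing and the blur come from.

#include <algorithm>
#include <cmath>

// One output pixel of a generic tent-filter resolve (illustrative only).
// Samples within `radius` of the pixel centre contribute with a weight that
// falls off linearly with distance; samples from neighbouring pixels are
// therefore blended in, which smooths edges but also softens texture detail.
struct Sample { float x, y, r, g, b; };   // sample position and colour

void TentResolve(const Sample* samples, int count,
                 float centreX, float centreY, float radius,
                 float& outR, float& outG, float& outB)
{
    float sumR = 0.0f, sumG = 0.0f, sumB = 0.0f, sumW = 0.0f;
    for (int i = 0; i < count; ++i)
    {
        const float dx = samples[i].x - centreX;
        const float dy = samples[i].y - centreY;
        const float dist = std::sqrt(dx * dx + dy * dy);
        const float w = std::max(0.0f, 1.0f - dist / radius);  // tent weight
        sumR += w * samples[i].r;
        sumG += w * samples[i].g;
        sumB += w * samples[i].b;
        sumW += w;
    }
    if (sumW > 0.0f)
    {
        outR = sumR / sumW;
        outG = sumG / sumW;
        outB = sumB / sumW;
    }
}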
 

Scali

Banned
Dec 3, 2004
Originally posted by: BFG10K
That's false; the 2000 series introduced edge-detect modes and 8xMSAA, and OpenGL AAA arrived a bit later.

They added more features. That isn't equivalent to improving the quality 'much' as I said.
But yes, their biggest problem is with AF. I just didn't think it was necessary to go into detail. Apparently you do :)
 

Scali

Banned
Dec 3, 2004
Originally posted by: ibex333
Well, this is all well and good, but why even make DX11 when they never even fully took advantage of DX10?

Why make quadcores when they never even fully took advantage of dualcores?
Software has to adapt gradually; you can't expect it to support all the latest features overnight. That doesn't mean you should stop adding features to your hardware, because then the evolution of hardware would come to a grinding halt.

Besides, for developers DX11 is very similar to DX10, so they'll get a head start with DX11 this time, whereas they had to start from scratch when going from DX9 to DX10.
You can reuse most of your code now, you just get extra features to use.
 

Scali

Banned
Dec 3, 2004
Originally posted by: soccerballtux
Don't play wow anymore either, but still, ATi has always had the better IQ from everything I've seen. You'll have to do more; please give links. There are enough paid Nvidia forum folks that we can't just take any person's word anymore...
So yeah, if you have any IQ comparisons that would be great.

That's funny actually. IQ comparison links have been posted various times on this and other forums. I thought it was common knowledge that nVidia indeed has a near-perfect AF, while ATi still shows that some angles are considerably worse than others.

On the other hand, you are the first person to complain about wrong colours on nVidia hardware, and going about it as if it's some kind of structural problem.

Then you talk about paid nVidia forum folks? :)
 

Scali

Banned
Dec 3, 2004
Originally posted by: evolucion8
While nVidia theoretically has the best implementation of AF on the market, the visual differences in games are almost 100% impossible to find unless you are looking with a microscope or carefully examining and zooming in on screenshots of the game.

Yes, I agree that it's hard to notice... But on the other hand, nVidia has had this quality since the G80, which is nearly 3 years old now. ATi has released no less than three generations of hardware since the G80, and STILL hasn't caught up with that level of quality. There's just something about that that doesn't feel right, from an enthusiast point-of-view.
Also, it probably gives ATi an unfair advantage in benchmarks, because they are effectively doing less work, saving on bandwidth and all that, by not filtering as accurately as nVidia does.
 

SickBeast

Lifer
Jul 21, 2000
Originally posted by: Scali
Originally posted by: evolucion8
While nVidia theoretically has the best implementation of AF on the market, the visual differences in games are almost 100% impossible to find unless you are looking with a microscope or carefully examining and zooming in on screenshots of the game.

Yes, I agree that it's hard to notice... But on the other hand, nVidia has had this quality since the G80, which is nearly 3 years old now. ATi has released no less than three generations of hardware since the G80, and STILL hasn't caught up with that level of quality. There's just something about that that doesn't feel right, from an enthusiast point-of-view.
Also, it probably gives ATi an unfair advantage in benchmarks, because they are effectively doing less work, saving on bandwidth and all that, by not filtering as accurately as nVidia does.

I agree with you, Scali, and my guess is that the AMD parts can perform quality AF, but the performance hit would be embarrassing for AMD.

ATi always had the best AF when the 9700/9800 cards were around. Their AA/AF went out the window when the 1800XT was released. That series of cards did more to harm ATi's brand than anything I have seen them come out with.
 

evolucion8

Platinum Member
Jun 17, 2005
Originally posted by: SickBeast
I agree with you, Scali, and my guess is that the AMD parts can perform quality AF, but the performance hit would be embarrassing for AMD.

No hardware since 2002 takes a huge performance hit from enabling AF, whether it's angle-independent or not. When the X1800XT debuted, enabling HQ AF incurred less than a 5% performance hit compared to Standard AF, and the image quality difference was outstanding. Today, the AF image quality difference between the G80 and R6x0 is smaller than that, and both cards are more powerful. So you are just speculating.

ATi always had the best AF when the 9700/9800 cards were around. Their AA/AF went out the window when the 1800XT was released. That series of cards did more to harm ATi's brand than anything I have seen them come out with.

Wrong, the GeForce FX had the best AF quality at that time compared to the Radeon 9700 series, and it was usable in DX8 games, because we all know how poor the GeForce FX's DX9 performance was. ATi has had the best AA quality since the R300 era, even today:

http://enthusiast.hardocp.com/...w0LCxoZW50aHVzaWFzdA==

In the first screenshot of the power lines it seems that the Radeon HD 4870's 4X AA looks better than the GeForce GTX 260's 4X AA. Looking at the other two screenshots though we can't see any difference between 2X and 4X AA in normal view.

In Crysis, we really can't see any differences in AA image quality at 2X or 4X. However, in the first screenshot of the shack look to the right of the screen, to the roof of the shack in the background. The roof looks slightly more detailed on the ATI hardware, than the NVIDIA hardware. We don't really know what this means or which image is supposed to be right, it's just a difference we noticed.

Here in Crysis the first screenshot shows hardly any difference notable between 8X CSAA and 8xQ MSAA in normal view. Looking at the second screenshot though we can see some clear differences, proving that 8xQ MSAA is higher quality than 8X CSAA. Looking at the cables that are right underneath the crane, we can see that with 8X CSAA they are more broken up and blocky than they are using 8xQ MSAA. In this image 8xQ MSAA matches up perfectly with ATI's 8X MSAA.

http://enthusiast.hardocp.com/...w1LCxoZW50aHVzaWFzdA==

To us, looking very closely at these image quality screenshots, it seems that ATI's 12X CFAA actually looks slightly better than NVIDIA's 16X CSAA. If you look closely at the power lines there seems to be better color blending and a better gradient with 12X CFAA. It is harder to tell with the second and third screenshots however. Looking closely at 24X CFAA we also find it to be slightly better than 16xQ CSAA on the power lines, but equally as nice as 16xQ on the other images.

We again see the same pattern follow here. Look closely at the cables under the crane, both 12X CFAA and 24X CFAA look better there in comparison to 8X CSAA and 16X CSAA.

To us, again, looking closely, it seems that 8X Adaptive AA looks better than 8X CS TR SSAA, but in normal view hard to tell. That tree in the first screenshot just doesn't look as good even at 16X CS TR SSAA as it does at ATI's 8X AD AA. The thing is though, we really have to look closely to see these kinds of differences, in normal view it is hard to see a difference.

http://alienbabeltech.com/main/?p=3188&all=1 << Read it, it may enlighten you.

The only cards I can recall that did harm to the Radeon brand were the HD 2900XT and the 8500 series.

Originally posted by: Scali
Yes, I agree that it's hard to notice... But on the other hand, nVidia has had this quality since the G80, which is nearly 3 years old now. ATi has released no less than three generations of hardware since the G80, and STILL hasn't caught up with that level of quality. There's just something about that that doesn't feel right, from an enthusiast point-of-view.
Also, it probably gives ATi an unfair advantage in benchmarks, because they are effectively doing less work, saving on bandwidth and all that, by not filtering as accurately as nVidia does.

ATi should at least consider keeping its current AF as Standard AF and give us an HQ checkbox to bring back near-perfect AF as an option, like in the X1K era. In benchmarks, while what you wrote is true, nVidia has more texturing and filtering power, so it shouldn't be that unfair.
 

Scali

Banned
Dec 3, 2004
Originally posted by: SickBeast
I agree with you, Scali, and my guess is that the AMD parts can perform quality AF, but the performance hit would be embarrassing for AMD.

ATi always had the best AF when the 9700/9800 cards were around.

Yea, the 9700/9800 series was revolutionary. Cards had AA and AF before, but for the first time, the performance hit was small enough to actually USE AA and AF on nearly all games.
I had a 9600XT at the time, and I have to admit I always used Catalyst AI, because I thought it was a good trade-off between performance and quality. The thing is, I had a choice.

Going back to AA... yes, modern cards may have more than just the 2x, 4x, 6xMSAA that you had with the 9700. But how many people actually use those? I've never used anything beyond 4xAA, because the higher modes just cost too much performance, and it's not worth the gain in image quality.
So in that sense, I personally at least haven't really moved from the 4xAA/16xAF settings that I started to use with my 9600XT at the time. I don't feel like there's been a revolution in image quality since.
 

Scali

Banned
Dec 3, 2004
Originally posted by: evolucion8
ATi should at least consider keeping its current AF as Standard AF and give us an HQ checkbox to bring back near-perfect AF as an option, like in the X1K era.

Agreed.

Originally posted by: evolucion8
In benchmarks, while what you wrote is true, nVidia has more texturing and filtering power, so it shouldn't be that unfair.

Well, isn't that the point of the unfairness? nVidia has more power, but because ATi does a simpler type of texture filtering, nVidia isn't faster in benchmarks, despite the extra power.
But that's been the problem in graphics for years. You can never really do a 1:1 comparison between different hardware.
That in itself isn't really a problem... But the way hardware is reviewed, it often isn't made clear that the comparisons aren't 1:1, and the focus is only on the performance of the cards. I would like to see more in-depth reviews with a more thorough analysis of the hardware and image quality.
 

BenSkywalker

Diamond Member
Oct 9, 1999
Who cares how good the AA is if you can't even see the graphics? Back when I was playing WoW, I took this screenshot. I haven't seen any ATI shots that look like this.

What in particular is horribly wrong with that picture you linked? I've asked you this before; all I can recall is you saying it was dark, but isn't that AB being played at night? Honestly, I'm left wondering if you have something horribly improperly calibrated.

While nVidia theoretically has the best implementation of AF on the market, the visual differences in games are almost 100% impossible to find unless you are looking with a microscope or carefully examining and zooming in on screenshots of the game.

It is painfully obvious in an instant for anyone with anything resembling a decent display and vision. Some people may not care much, but it is 2x4 between the eyes obvious.

When the X1800XT debuted, enabling HQ AF incurred less than a 5% performance hit compared to Standard AF, and the image quality difference was outstanding.

Really? With what drivers? I have an 1800xtx sitting here and the AF quality is very poor compared to integrated nV graphics running off a mobo.

ATi always had the best AF when the 9700/9800 cards were around.

The nV2x parts utterly obliterated the R300 cores in terms of AF quality; it really wasn't remotely close. The nV2x parts were the last we have seen that did full, proper AF. The newer nV cards have gotten extremely good at approximation, but they still can't output the same pixel-perfect AF the nV2x parts did. The downside was that the nV2x took a catastrophic performance hit while doing it. I still have an R9800Pro and a Ti4200 in old machines here too; the situation hasn't changed.
 

Scali

Banned
Dec 3, 2004
Originally posted by: BenSkywalker
The nV2x parts utterly obliterated the R300 cores in terms of AF quality; it really wasn't remotely close. The nV2x parts were the last we have seen that did full, proper AF. The newer nV cards have gotten extremely good at approximation, but they still can't output the same pixel-perfect AF the nV2x parts did. The downside was that the nV2x took a catastrophic performance hit while doing it. I still have an R9800Pro and a Ti4200 in old machines here too; the situation hasn't changed.

That's exactly the problem... I doubt many people ever actually used AF before R300, because it was just too slow. The algorithm on early AF hardware was just brute force, so if you used 16xAF, it actually took 16 samples for every pixel. Sure, it gives perfect results, but it's not a practical solution because it takes far too much bandwidth.
With R300 you could suddenly use 16xAF with a very modest performance hit. As such, R300 had better AF quality in practice. You could run it with 16xAF (and 4xAA) on pretty much all games, while earlier cards only performed well with trilinear filtering and no AA.
 

Cookie Monster

Diamond Member
May 7, 2005
Originally posted by: BenSkywalker
Really? With what drivers? I have an 1800xtx sitting here and the AF quality is very poor compared to integrated nV graphics running off a mobo.

Firstly, the X1800XTX card never existed :p maybe you meant the X1900XTX.
Here is the screenshot that shows the HQ AF quality on the X1000 family.

Link

The nV2x parts utterly obliterated the R300 cores in terms of AF quality; it really wasn't remotely close. The nV2x parts were the last we have seen that did full, proper AF. The newer nV cards have gotten extremely good at approximation, but they still can't output the same pixel-perfect AF the nV2x parts did. The downside was that the nV2x took a catastrophic performance hit while doing it. I still have an R9800Pro and a Ti4200 in old machines here too; the situation hasn't changed.

I guess the difference between the two is that one was practical while making trade-offs with quality, while the other was not. But with today's hardware and the kind of performance figures we get (and the almost non-existent performance hit of using 16xAF in most games), maybe it is time for both IHVs to focus on AF quality.
 

BFG10K

Lifer
Aug 14, 2000
Originally posted by: evolucion8

While nVidia theoretically has the best implementation of AF on the market, the visual differences in games are almost 100% impossible to find unless you are looking with a microscope or carefully examining and zooming in on screenshots of the game.
I'd disagree with this and in fact I'd argue that at this time AF offers one of the most obvious differences, if not the most. There are no still images required: just fire up the games and watch many surfaces shimmer more on ATi parts than nVidia's, especially the newer ones that exhibit shader aliasing.

Perhaps the masses won't notice (especially those that have never seen nVidia's parts), but to me it's like night and day. After gaming on G80 cards since November 2006, I saw the regression immediately during the 3-4 month period I gamed on my 4850. I was like "that surface never used to move so much before, what's going on?"

Now pair nVidia's near-perfect AF with super-sampling and you'll get images that are really the finest available in consumer space. Old games especially have pixel-perfect surfaces and no amount of ATi's wide tent blurring will match that. That's why I'm shocked ATI removed RGSS from Super-AA (a major asset to image quality) and instead resorted to derivatives of wide tent and edge-detect. ATi's sample patterns are programmable (unlike nVidia's that are fixed), but they're squandering that advantage by not offering super-sampling modes.

ATi's edge detect is currently the best anti-aliasing implementation and offers notable differences in image quality compared to other modes, but its usability is questionable, like nVidia's super-sampling modes, since both are pretty demanding performance-wise.
Well yes, technically 24xAA is better than 16xQ, but it's probably the hardest to spot during actual gaming, if you can spot it at all, that is.

While I'm not disputing the visible benefits of increased edge AA by any stretch of the imagination, the fact remains that after 4xAA you start seeing rapidly diminishing returns on edges, so to see large benefits in a scene you have to start anti-aliasing other parts. That's where super-sampling comes in.

Often the difference between 24xAA and 16xQ is only visible through zoomed images, but the difference between 4xAA and 8xS is huge in-game, especially if the game has a lot of shader or texture aliasing.
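To make the super-sampling point concrete, a plain ordered-grid SSAA resolve is just rendering at a multiple of the target resolution and box-filtering down, which is why it also tames shader and texture aliasing: every shaded sample contributes to the final pixel, not just geometry edges. A minimal sketch of the resolve step (a generic illustration, not nVidia's actual 8xS pipeline):

#include <vector>

// Colour of one sample/pixel.
struct Color { float r, g, b; };

// Ordered-grid supersampling resolve: `hiRes` holds a frame rendered at
// (outWidth*factor) x (outHeight*factor); each output pixel is the box-filtered
// average of the factor*factor samples that cover it.
std::vector<Color> ResolveSSAA(const std::vector<Color>& hiRes,
                               int outWidth, int outHeight, int factor)
{
    std::vector<Color> out(static_cast<size_t>(outWidth) * outHeight);
    const int srcWidth = outWidth * factor;

    for (int y = 0; y < outHeight; ++y)
    {
        for (int x = 0; x < outWidth; ++x)
        {
            Color sum{0.0f, 0.0f, 0.0f};
            for (int sy = 0; sy < factor; ++sy)
                for (int sx = 0; sx < factor; ++sx)
                {
                    const Color& s =
                        hiRes[(y * factor + sy) * srcWidth + (x * factor + sx)];
                    sum.r += s.r; sum.g += s.g; sum.b += s.b;
                }
            const float inv = 1.0f / (factor * factor);
            out[static_cast<size_t>(y) * outWidth + x] =
                {sum.r * inv, sum.g * inv, sum.b * inv};
        }
    }
    return out;
}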
 

sxr7171

Diamond Member
Jun 21, 2002
Originally posted by: ZimZum
Originally posted by: Keysplayr
Originally posted by: thilan29
Originally posted by: SSChevy2001
Also ATi Stream encoding works just fine with Cyberlink PowerDirector 7 and Espresso MediaShow.

It's transcoding unfortunately. I have yet to see encode acceleration...I don't know if it's even possible though. What I'd like is to be able to take some HD-DVDs I have and encode them into H264 or something similar with GPU acceleration.

What would be an example of true encoding? What would you use encoding for and how would you do it? Can you give a couple of examples?

Transcoding is converting from one compressed encoded format to another compressed encoded format.

Encoding is going from an uncompressed un-encoded format, to a compressed encoded format.

If you want to put some BR or regular DVD content on a media server or HTPC in a compressed format, that's encoding.

Currently I don't think either Nvidia or ATI has true hardware-accelerated video encoding.



That's transcoding as well. Blu-Ray is also compressed video.
 

BFG10K

Lifer
Aug 14, 2000
Originally posted by: Scali

They added more features. That isn't equivalent to improving the quality 'much' as I said.
Uh, what? Those features are responsible for significantly increasing the quality of AA, thereby demonstrating the inaccuracy of your comment. That's why I mentioned them.

What are you actually trying to argue, exactly?

I thought it was common knowledge that nVidia indeed has a near-perfect AF, while ATi still shows that some angles are considerably worse than others.
This has almost nothing to do with angles; it's the fact that the surfaces are under-filtered on ATi's parts so they visibly shimmer more than nVidia's parts, even the angles that get full filtering. The angle issue is but a tiny factor in the grand scheme of things.

Also, it probably gives ATi an unfair advantage in benchmarks, because they are effectively doing less work, saving on bandwidth and all that, by not filtering as accurately as nVidia does.
This I agree with; it's the exact reverse of what we saw during the GF7 days when nVidia's parts under-filtered compared to the X1xx series.

Going back to AA... yes, modern cards may have more than just the 2x, 4x, 6xMSAA that you had with the 9700. But how many people actually use those?
If you're going to use the appeal-to-popularity logical fallacy then, taking it to its natural progression, nothing outside the GMA matters, and neither does any resolution over 1024x768.

The algorithm on early AF hardware was just brute force, so if you used 16xAF, it actually took 16 samples for every pixel. Sure, it gives perfect results, but it's not a practical solution because it takes far too much bandwidth.
What on Earth are you talking about? Early nVidia hardware couldn't even do 16xAF, and ATi's parts performed far more hacks than even the R300, which in turn did less work than parts since the X1xx.

Furthermore 16xAF doesn't mean 16 samples (LMAO), it means 64 or 128 samples, minus any optimizations for a given surface.

With R300 you could suddenly use 16xAF with a very modest performance hit. As such, R300 had better AF quality in practice.
I agree with the significance of the R300 as it offered usable 16xAF in every game with pretty good IQ at the time. Since I had that card, 16xAF has become a mandatory minimum for me in gaming.
 

BFG10K

Lifer
Aug 14, 2000
Originally posted by: SickBeast

I agree with you, Scali, and my guess is that the AMD parts can perform quality AF, but the performance hit would be embarrassing for AMD.
No, the AF algorithm is fixed at the hardware level. So unless they're hiding it, no driver update is going to improve things.

ATi always had the best AF when the 9700/9800 cards were around. Their AA/AF went out the window when the 1800XT was released.
I'm not sure where this comes from.

R3xx vs NV3x: The NV3x filtered more angles, but the R3xx could take up to twice the samples, and was much faster. You could argue both ways with this one (I personally side with ATi here).

R5xx vs G7x: HQ on the R5xx was superior as it shimmered less and filtered more surfaces, especially compared to the G7x's default driver settings. G7x HQ shimmered much less but it took a steep performance hit and still didn't filter more surfaces. R5xx actually improved on the R3xx/R4xx because it filtered more angles.

Here's a little write-up I did about HQ vs Q on the G7x back in the day:

http://episteme.arstechnica.co...7909965/m/999005329731
 

BenSkywalker

Diamond Member
Oct 9, 1999
Firstly, the X1800XTX card never existed, maybe you meant the X1900XTX.

Nope, it is an 1800xt; I thought it was an 1800xtx with the monster heatsink and leaf blower that would never spin up, but I just checked the box :p

Here is the screenshot that shows the HQ AF quality on the X1000 family.

And even in those screenshots you can see moiré. I'm not saying the 7900GT looks great either, although I don't own any of those or boards built with a comparable-generation GPU. Current integrated nV graphics make either one of those look outright horrid in comparison. If I just want sharper textures and don't care about aliasing I can adjust the LOD bias; ATi's AF isn't that bad, but it is somewhere in between that and nV's AF.

I guess the difference between the two is that one was practical while making trade-offs with quality, while the other was not.

I used AF on the Ti all the time: AF first, AA if I had performance available. Certainly it was much slower than the R9500Pro (I actually traded that board off because the drivers and IQ were so poor; that's how I got the Ti), but by the time the R9800Pro was hitting, the Ti was just far too slow to run current games with AF at all anymore. After getting used to the AF on the R300 I was shocked when I got the 1800 at how much better the AF was, but then that board was replaced by a G92 part and it was a major eye-opener. The R200 also offered AF with performance hits comparable to the R300, though it looked even worse.

But with today's hardware and the kind of performance figures we get (and the almost non-existent performance hit of using 16xAF in most games), maybe it is time for both IHVs to focus on AF quality.

With nV's current AF I have a hard time spotting differences between it and the nV2x parts. Enabling VSync and firing up an old game, most of the time the newer parts look slightly better, as they support a higher level of anisotropy than the nV2x (they were limited to 8x). If you look for it you can catch some off-angle areas that are better on the nV2x compared to current parts if you run them side by side, but in a realistic sense nV's current AF is very close to 'perfect'.
 

Scali

Banned
Dec 3, 2004
Originally posted by: BFG10K
Uh, what? Those features are responsible for significantly increasing the quality of AA, thereby demonstrating the inaccuracy of your comment. That's why I mentioned them.

Since I used the subjective qualifier 'much' in my comment, the accuracy of the comment cannot be determined in the first place.
Other than that, they added extra AA modes, which required more processing, and as such were not very practical.

Originally posted by: BFG10K
What are you actually trying to argue, exactly?

I could ask you the same thing. I just made a remark that ATi hadn't improved AA/AF much. You then pull the AA out of context and go on some off-topic ego trip AGAIN.

Originally posted by: BFG10K
If you're going to use the appeal-to-popularity logical fallacy then, taking it to its natural progression, nothing outside the GMA matters, and neither does any resolution over 1024x768.

Rubbish.

Originally posted by: BFG10K
What on Earth are you talking about? Early nVidia hardware couldn't even do 16xAF, and ATi's parts performed far more hacks than even the R300, which in turn did less work than parts since the X1xx.

I'm talking about the fact that early hardware took a brute-force approach.

Originally posted by: BFG10K
Furthermore 16xAF doesn't mean 16 samples (LMAO), it means 64 or 128 samples, minus any optimizations for a given surface.

You mean that in practical implementations it is usually 64 or 128 TAPS, which is not exactly a given; it's hardware/driver dependent. It certainly doesn't MEAN that it will do 64/128 taps.
16x MEANS that your texels may have a max anisotropic level of 16x after being projected on screen, namely the width:height ratio is 16:1 (or 1:16, depending on the orientation).
That is the level of anisotropy the filter is expected to deal with.
If you want to try and look smart, at least get your facts straight. Now you just look like an idiot trying to correct someone with the WRONG info, LMAO.
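For what it's worth, the two readings can be reconciled by the footprint math. A rough sketch modelled on the EXT_texture_filter_anisotropic description (real hardware uses its own proprietary approximations, so treat this purely as an illustration): the "16x" setting caps the ratio of the footprint's long axis to its short axis, that ratio determines how many trilinear probes are taken along the long axis, and each trilinear probe is itself 8 texel reads, which is where figures like 128 taps for 16xAF come from (64 with bilinear probes).

#include <algorithm>
#include <cmath>

// Footprint-based anisotropic filtering, following the approach described in
// the EXT_texture_filter_anisotropic spec. dudx/dvdx/dudy/dvdy are the
// texture-coordinate derivatives (in texels) across one screen pixel.
int anisotropicProbeCount(float dudx, float dvdx, float dudy, float dvdy,
                          float maxAniso /* e.g. 16.0f for "16xAF" */)
{
    const float px = std::sqrt(dudx * dudx + dvdx * dvdx); // footprint extent in x
    const float py = std::sqrt(dudy * dudy + dvdy * dvdy); // footprint extent in y
    const float pMax = std::max(px, py);
    const float pMin = std::max(std::min(px, py), 1e-6f);  // avoid divide-by-zero

    // Degree of anisotropy: long axis over short axis, clamped by the
    // user-selected maximum ("16x" means a ratio of up to 16:1).
    const float n = std::min(std::ceil(pMax / pMin), maxAniso);

    // n trilinear probes are taken along the footprint's long axis; each
    // trilinear probe reads 8 texels, so 16 probes -> up to 128 texel taps.
    return static_cast<int>(n);
}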