So it seems ATI really has the upper hand in AA implementation and quality with the x1800s...


golem

Senior member
Oct 6, 2000
838
3
76
Originally posted by: Ackmed
Not responding to Rollo's trolling post.

The difference between what NV did with the FX series, and what ATi is doing now, is giving people the option of turning on better AF. The FX had many problems to start with, and the horrible AF performance was just one of them. If NV had given people the option of turning on the best AF, or not, it wouldnt have been a problem. With ATi's new cards, you can turn on the best AF, or not. By not doing it, you would still get decent AF (on par with NV's), and better frames. Or you could get the best AF avail, and lower frames.

But didn't ATI cause the problem of crappy AF in the first place with the 9700 series?
 

Madellga

Senior member
Sep 9, 2004
713
0
0
Originally posted by: Rollo
Originally posted by: Madellga
Originally posted by: CaiNaM
anyone out there so disgusted with the iq of their 7800 gt/gtx cards that they're willing to have a "fire sale" to get rid of theirs so they can buy an 1800xl? i'd be willing to pay $200 for some of these broken cards with crappy IQ.

Last time I checked (last weekend) I could sell it for 400 or more. Better to lose 50 bucks than 450.

The difference from the others is that I have the courage to say I bought a bad product.

Hum, maybe that will spread and lower the card's value !? From now on, I will say the GTX is great and you should buy it.

New reason: I am selling my GTX because I need to pay my bills and need the money.

I think you should link to all your anti- nVidia posts in your ad, so people can get an idea what you think of the card you're selling?

;)

That's sort of a sad story, liquidating your video card to pay the phone bill. :(
Hope things look up for you.


One thing though. My post was about something the thread is discussing: image quality.

You say I keep repeating myself? What about you? You post in every thread on this forum about:
1) ATI not available
2) ATI insider trading
3) ATI reduced warranty
4) ATI bla bla bla

You are a true FUD and OT poster. Mostly trolling all threads with a kind of polite/ironic approach. If someone doesn't agree with you, it is anti Nvidia. You live in a binary world: pro and contra.

But you are an anti-ATI guy, that's for sure.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Ackmed
I read fine. I am aware of these issues. These issues do not mean that the 7800 series is "broken". AA and AF are working as planned. The shimmering problem seems to be ok with them, because they dont seem to have any attempt to fix it, other than for HQ. I wouldnt go as far as calling the IQ of the 7800 series "crappy", its just not as good as the X1800's, in my opinion.
so if they (as in nv) don't want to fix it, it's not broken? kind of like hear no evil, see no evil.... close your eyes and it's not there :)

The difference between what NV did with the FX series, and what ATi is doing now, is giving people the option of turning on better AF. The FX had many problems to start with, and the horrible AF performance was just one of them. If NV had given people the option of turning on the best AF, or not, it wouldnt have been a problem. With ATi's new cards, you can turn on the best AF, or not. By not doing it, you would still get decent AF (on par with NV's), and better frames. Or you could get the best AF avail, and lower frames.

i'm not sure i follow.. ati forced you to have crappy af, with no option to make it better. how is this a bad move by nvidia? all i'm saying is what's good for the goose is good for the gander, so to speak. it's hypocritical to all of a sudden think AF quality is the focal point when the shoe is on the other foot. imo the bad move by nv was following ati's lead and "dumbing down" the af in nv40 to ati level.

that being said, i'm all for better texture filtering and improved IQ. kudos to ati for moving back in that direction. i just find it annoying that, when ati changed the 3d landscape by "optimizing" filtering and degrading image quality, there was no similar uproar from the ati camp, that's all.

 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Originally posted by: CaiNaM
Originally posted by: Ackmed
I read fine. I am aware of these issues. These issues do not mean that the 7800 series is "broken". AA and AF are working as planned. The shimmering problem seems to be ok with them, because they dont seem to have any attempt to fix it, other than for HQ. I wouldnt go as far as calling the IQ of the 7800 series "crappy", its just not as good as the X1800's, in my opinion.
so if they (as in nv) don't want to fix it, it's not broken? kind of like hear no evil, see no evil.... close your eyes and it's not there :)

The difference between what NV did with the FX series, and what ATi is doing now, is giving people the option of turning on better AF. The FX had many problems to start with, and the horrible AF performance was just one of them. If NV had given people the option of turning on the best AF, or not, it wouldnt have been a problem. With ATi's new cards, you can turn on the best AF, or not. By not doing it, you would still get decent AF (on par with NV's), and better frames. Or you could get the best AF avail, and lower frames.

i'm not sure i follow.. ati forced you to have crappy af, with no option to make it better. how is this a bad move by nvidia? all i'm saying is what's good for the goose is good for the gander, so to speak. it's hypocritical to all of a sudden think AF quality is the focal point when the shoe is on the other foot. imo the bad move by nv was following ati's lead and "dumbing down" the af in nv40 to ati level.

that being said, i'm all for better texture filtering and improved IQ. kudos to ati for moving back in that direction. i just find it annoying that, when ati changed the 3d landscape by "optimizing" filtering and degrading image quality, there was no similar uproar from the ati camp, that's all.

Uh, I already said I am aware of the problems. I also said that its not broken, because they intended it to be that way. Why cant you understand that?

AF from previous ATi cards was far from crappy. It just wasnt as good as it is now. Show me where I said that AF wasnt important when NV had better AF than they do now? Dont call me hypocritical if you cant back it up. Also, The GF4 had better AF than the FX, thats where it all started. As I remember it, the 9700 Pro had as good of AF as the 5800U. Some pics found here side by side.

How about some pics from some games that show these degrading image qualities when you claim that ATi "changed the 3d landscape"?
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Ackmed
Uh, I already said I am aware of the problems. I also said that its not broken, because they intended it to be that way. Why cant you understand that?
your exact comment was, "shimmering problem seems to be ok with them, because they dont seem to have any attempt to fix it.", which seems to state if they (nv) don't acknowledge it, the problem "isn't a problem".... did you intend to say something else?
AF from previous ATi cards was far from crappy.
OMG.. now the AF on previous ati cards is "okay"? they are the same quality as with current g70 cards (the algorithms are almost identical).
It just wasnt as good as it is now.
and nv's is no worse now than it was in last gen's ati cards....
Show me where I said that AF wasnt important when NV had better AF than they do now? Dont call me hypocritical if you cant back it up.
why on earth do you assume it's all about you? i never singled you out specifically, i referred in general to the rather vocal ati camp..
Also, The GF4 had better AF than the FX, thats where it all started.
as i stated in an earlier post in this thread.
As I remember it, the 9700 Pro had as good of AF as the 5800U. Some pics found here side by side.
you remember incorrectly. heh.. r300 started the term, "brilinear".. or has that slipped from memory?

at any rate, ati only filtered certain angles, thereby doing much less work than nv with a resulting performance increase, but at the cost of image quality. at certain angles, it could be considered comparable, but others weren't even being calculated.. and do we have to bring up the whole ati af cheat where they were forcing bilinear filtering when colored mipmaps were detected?

How about some pics from some games that show these degrading image qualities when you claim that ATi "changed the 3d landscape"?
lol... i'm not even going to respond to that.. there is such an abundance of articles/discussions/conclusions on the whole "optimized af" ati brought to the table with r300, it's asinine to even try and argue the point.

 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: southpawuni
It would be absolutely ridiculous to buy a X1K based off of that very small, minute difference in AA/AF quality and give up all the Nvidia advantages of drivers/SLI/heat/power/single slot solution/price/availablility.

I think at this point, any openminded unbiased person would admit the Nvidia solutions are superior to ATI.

I'll give ATI superior AA/AF though, if only slightly, and it really doesnt bring enough to the table to merit a purchase over a competing Nvidia solution.
And the AA quality really means a whole lot less considering FPS are the dominant genre today.

Welp, I'm off to play my DVD special edition of Quake 4 online!

southpawuni, you're just like Rollo - you downplay the disadvantages and go overboard embellishing your card of choice (which also happens to be the same company).

The X1K series has better image quality period. AA is a bit better; AF is no contest anymore.

The Nvidia advantages you list, I will break down.

Drivers: I disagree wholeheartedly. I think ATI's drivers are excellent, and I prefer their scheduled driver updates to Nvidia's ad-hoc "we'll release WHQL drivers when we feel like it" attitude. Although I wouldn't say either driver is leagues better than the other.

SLI: Nvidia wins this one, although ATI at least has the option of dual cards in Crossfire. Regardless SLI's main advantage is that you can run two of the fastest cards out there (since many mid-range cards in SLI are only equivalent to a single top-of-the-line card, and often cost the same). So SLI will always be a niche.

Heat: sure, the X1K series produce more heat. Personally, I'd buy an aftermarket cooler for any GPU I was keeping longterm these days. Nvidia were the heat-mongers of the day in the NV30 days and up, so now the tables have turned. Regardless, G70 isn't exactly a "cool" running GPU either, it loads in the 70's and up!

Single Slot solution: some people actually prefer dual slot these days. Once again Nvidia brought dual-slot first to the market, yet apparently you shun their invention...

Availability: true for the X1800XT. We'll see if ATI keeps their word soon enough.
 

linkgoron

Platinum Member
Mar 9, 2005
2,598
1,238
136
Originally posted by: Rollo
You know what's hilarious about this?

Back in the days of the 5800/5900 I used to post the AF test wheel all the time, when the situation was reversed.

And you know what?

The ATI Defense League used to post in reply,"Jesus, not the damn AF wheel again Rollo! You don't see such things in a game!"

My guess is some of the people here saying this AF difference is the end of civilization were among them.

Heh- the more things change, the more they stay the same. ;)

BTW- should I start posting "Not those AF wheels again!"? Nah, guess not.
Are you a part of the nVidia defence league?

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Madellga
But you are an anti-ATI guy, that's for sure.

LOL- oh noes!

Lately I am, that is for sure. If they would bring out some interesting merchandise they would go a long way toward getting back in my box again?

 

Tanclearas

Senior member
May 10, 2002
345
0
71
Originally posted by: Rollo
LOL- oh noes!

Lately I am, that is for sure. If they would bring out some interesting merchandise they would go a long way toward getting back in my box again?

That's what the blonde said to the sailor.

:D
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Originally posted by: CaiNaM
Originally posted by: Ackmed
Uh, I already said I am aware of the problems. I also said that its not broken, because they intended it to be that way. Why cant you understand that?
your exact comment was, "shimmering problem seems to be ok with them, because they dont seem to have any attempt to fix it.", which seems to state if they (nv) don't acknowledge it, the problem "isn't a problem".... did you intend to say something else?

No, I said what I meant. They intended it to be that way. As sad as it is, they dont show any interest in getting rid of it in Quality mode, the default the drivers ship in and what most people will use.

Originally posted by: CaiNaM
AF from previous ATi cards was far from crappy.
OMG.. now the AF on previous ati cards is "okay"? they are the same quality as with current g70 cards (the algorithms are almost identical).

Im not the one who said the AF on 7800s was crappy, that was you. In fact, I said it wasnt crappy earlier.

Originally posted by: CaiNaM
It just wasnt as good as it is now.
and nv's is no worse now than it was in last gen's ati cards....

I never said it was.

Originally posted by: CaiNaM
Show me where I said that AF wasnt important when NV had better AF than they do now? Dont call me hypocritical if you cant back it up.
why on earth do you assume it's all about you? i never singled you out specifically, i referred in general to the rather vocal ati camp..

I assume it because you've quoted me and said it several times.

Originally posted by: CaiNaM
Also, The GF4 had better AF than the FX, thats where it all started.
as i stated in an earlier post in this thread.

Glad you can agree.

Originally posted by: CaiNaM
As I remember it, the 9700 Pro had as good of AF as the 5800U. Some pics found here side by side.
you remember incorrectly. heh.. r300 started the term, "brilinear".. or has that slipped from memory?

at any rate, ati only filtered certain angles, thereby doing much less work than nv with a resulting performance increase, but at the cost of image quality. at certain angles, it could be considered comparable, but others weren't even being calculated.. and do we have to bring up the whole ati af cheat where they were forcing bilinear filtering when colored mipmaps were detected?

Again, where are the game shots? Showing a synthetic color wheel doesnt mean anything if it doesnt correlate to in-game quality loss. The link I dropped showed them having the same quality, while the FX took a huge penalty in frames. Calling it a cheat already shows which way you lean.

Originally posted by: CaiNaM
How about some pics from some games that show these degrading image qualities when you claim that ATi "changed the 3d landscape"?
lol... i'm not even going to respond to that.. there is such an abundance of articles/discussions/conclusions on the whole "optimized af" ati brought to the table with r300, it's asinine to even try and argue the point.

Nice dodge. (again)

This topic is going nowhere. The simple fact is, ATi has better AF, and very arguably better IQ overall. With what seems to be better AA, and AA+HDR in some games. Some people dont like it, and some (Gamingphreek) flat out deny the AF results. This topic has run its course for me. Im done posting about it. The facts are laid out, people can either choose to accept them, or not.

Not once have I posted with a "neener neener" attitude, yet some respond with off-topic trolling. Again, hopefully NV will step up to the plate and get back on par with ATi with the AF quality. Im sure they will, as it seems as soon as one 1-ups the other, the other responds.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
I can think of a few times off the top of my head that they've cut support off from products when Nvidia has continued to support theirs.
I never had those products when it happened so it doesn't concern me.

For me, this is a clear and decisive part where Nvidia drivers are clearly better supported than ATI.
I'm actually more interested in not having my system spontaneously reboot rather than supporting some ancient card from 1999.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: golem
But didn't ATI cause the problem of crappy AF in the first place with the 9700 series?
ATI was the first to sacrifice IQ in order to offer AF at a minimum framerate hit with R200 (aka 8500/9100). They did this by introducing angle-dependence and by only allowing bilinear filtering between MIP-maps. An argument for this is that 16xAF pushes MIPmap boundaries so far back that bilinear wasn't that big of a deal. The angle-dependency was visible in everyday usage, as Serious Sam and Enemy Territory showed.
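To make "angle-dependence" concrete, here's a toy Python sketch of R200-style behavior. The thresholds and clamp levels are made-up numbers for illustration, not ATI's actual hardware logic:

```python
def effective_af(requested_af, angle_deg):
    """Toy model of angle-dependent AF (not ATI's real hardware logic).

    Surfaces whose anisotropy axis lies near the texture's 0/90-degree
    axes get the full requested AF level; in-between angles get clamped
    to a lower level, which is what produced the "flower" shape in the
    AF tester wheels.
    """
    # distance (in degrees) to the nearest preferred axis (0 or 90)
    dist = min(angle_deg % 90, 90 - (angle_deg % 90))
    if dist < 10:
        return requested_af               # near a preferred axis: full AF
    elif dist < 30:
        return max(2, requested_af // 4)  # partially reduced
    else:
        return 2                          # worst case around 45 degrees

print(effective_af(16, 0))    # full 16x near an axis
print(effective_af(16, 45))   # heavily clamped at off angles
```

That clamping is why surfaces at odd angles (sloped floors in Serious Sam, for instance) looked blurrier than axis-aligned walls, even with 16xAF requested.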

Originally posted by: CaiNaM
you remember incorrectly. heh.. r300 started the term, "brilinear".. or has that slipped from memory?

at any rate, ati only filtered certain angles, thereby doing much less work than nv with a resulting performance increase, but at the cost of image quality. at certain angles, it could be considered comparable, but others weren't even being calculated.. and do we have to bring up the whole ati af cheat where they were forcing bilinlear filtering when colored mipmaps were detected?
Actually, IIRC brilinear was first detected on NV cards in UT2k3 (and possibly by Dave in B3D's forums, visible with MIPmap coloring). It was their response to the low performance hit (but also lower IQ) R300 took with AF enabled. ATI later introduced what we dubbed "trylinear" (similar but not exactly like brilinear) in their post-R300 parts, starting with RV350 and extending to the R4x0 series (and possibly R5x0). AFAIK, R300 itself didn't do the trylinear optimization, as it didn't have hardware support for it. ATI did default to trilinear on the first texture layer and bilinear on the remaining ones, though. I can't remember if NV did the same (or if they followed or led). I do know that current ATI drivers have a setting to force trilinear on all layers, which I recently used to great effect in Halo.

Just to be clear, brilinear is not the same as angle dependence. They're separate issues. Bi-/tri-/bri-/try-linear all refer to how cards blend MIP-maps at their boundaries (where a texture sample abuts one that's farther away and thus lower-res). AFAIK, angle dependence is simply measuring the angle of a texture relative to the camera and applying more or less AF depending on the angle.
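A rough way to see the difference between the blending modes is how much weight each gives the next MIP level as the fractional LOD crosses a boundary. Toy Python sketch (the blend-band half-width is a made-up value; real drivers tune this, and the actual hardware works per-sample):

```python
def mip_blend(lod, mode):
    """Toy model of MIP-map blend weight per filtering mode.

    lod is the fractional level-of-detail, e.g. 2.3 means 30% of the
    way from MIP level 2 to level 3. Returns (level, blend) where
    blend is the weight given to the next (lower-res) level.
    """
    level = int(lod)
    frac = lod - level
    if mode == "bilinear":
        # no blending between levels: hard MIP boundary (banding)
        return level, 0.0
    if mode == "trilinear":
        # full linear blend across the whole transition
        return level, frac
    if mode == "brilinear":
        # blend only inside a narrow band around the boundary and
        # snap to a single level elsewhere: fewer texture fetches,
        # but faint banding can creep back in
        band = 0.25  # half-width of the blend band (made-up value)
        if frac < 0.5 - band:
            return level, 0.0
        if frac > 0.5 + band:
            return level, 1.0
        return level, (frac - (0.5 - band)) / (2 * band)
    raise ValueError(mode)
```

Bilinear snaps hard between levels, trilinear blends across the entire transition, and bri-/try-linear only blend in a band around the boundary to save texture bandwidth, which is why colored-mipmap tests expose them so easily.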

R300 improved on R200 in two ways: it allowed for trilinear, and it was more generous in applying AF (IIRC, it allowed for full AF at 45 degrees in addition to 0 and 90). It was still inferior in terms of quality to NV30, NV25, and NV20 because of its angle dependence, but that was a tradeoff ATI made in favor of speed. R520 appears to finally offer AF approaching GF3/4/FX in quality but closer to 8500/9700 in speed.

I don't remember exactly if NV30 had lower-quality AF than NV2x at launch, or if that came later with brilinear. Either way, it was still technically superior to ATI's (and you can see this in the X850 vs. 7800GT AF tester pics in the link below).

BTW, Cainam, that "cheat" (I agree it was dirty but I don't know if it rises to the level of a cheat, b/c IIRC most ppl were hard-pressed to detect a difference b/w tri and try) was that ATI forced full trilinear with MIPmap coloring, but resorted to "trylinear" in regular gaming. The simple way around it is to either just play the game or take some screenshots (er, you can ignore the AF colormaps on the first page :) and focus on the actual in-game screenshots on the following pages).