Originally posted by: hans030390
Just get a 6800GT because of Shader Model 3.0.
Image quality will be roughly the same between the cards.
Originally posted by: VIAN
The 6800GT is capable of displaying better image quality, hands down, than any current-generation ATI card.
Originally posted by: VIAN
And you are a burrito.
In case you missed it, here is my thread detailing that Nvidia does indeed have better image quality than ATI.
http://forums.anandtech.com/messageview...atid=31&threadid=1535724&enterthread=y
Originally posted by: KeepItRed
Originally posted by: VIAN
And you are a burrito.
In case you missed it, here is my thread detailing that Nvidia does indeed have better image quality than ATI.
http://forums.anandtech.com/messageview...atid=31&threadid=1535724&enterthread=y
I still see ATi leading Nvidia. Considering you're using beta drivers and no Catalyst, I find this inaccurate. Also, ATi's Call of Duty and HL2 looked three times better than what Nvidia had to offer.
Originally posted by: keysplayr2003
Originally posted by: KeepItRed
Originally posted by: VIAN
And you are a burrito.
In case you missed it, here is my thread detailing that Nvidia does indeed have better image quality than ATI.
http://forums.anandtech.com/messageview...atid=31&threadid=1535724&enterthread=y
I still see ATi leading Nvidia. Considering you're using beta drivers and no Catalyst, I find this inaccurate. Also, ATi's Call of Duty and HL2 looked three times better than what Nvidia had to offer.
I can't help but think that your screen name "Keep it Red" is an ATI reference. If so, how can people take your answers seriously when your favoritism shows up even in your name? ATI and Nvidia image quality are equal these days, excluding the 7800 with TAA/SS.
Originally posted by: KeepItRed
Originally posted by: keysplayr2003
Originally posted by: KeepItRed
Originally posted by: VIAN
And you are a burrito.
In case you missed it, here is my thread detailing that Nvidia does indeed have better image quality than ATI.
http://forums.anandtech.com/messageview...atid=31&threadid=1535724&enterthread=y
I still see ATi leading Nvidia. Considering you're using beta drivers and no Catalyst, I find this inaccurate. Also, ATi's Call of Duty and HL2 looked three times better than what Nvidia had to offer.
I can't help but think that your screen name "Keep it Red" is an ATI reference. If so, how can people take your answers seriously when your favoritism shows up even in your name? ATI and Nvidia image quality are equal these days, excluding the 7800 with TAA/SS.
Yes, for the high end they're about equal, but on the lower-end models you can see a difference.
Yes, you didn't read the article, you fanboy. I was using official Catalyst drivers. I used beta drivers at some point for Nvidia, but the image quality never changed.
Originally posted by: KeepItRed
I still see ATi leading Nvidia. Considering you're using beta drivers and no Catalyst, I find this inaccurate. Also, ATi's Call of Duty and HL2 looked three times better than what Nvidia had to offer.
I wouldn't be bitching if it wasn't noticeable, but it is in fact highly noticeable. Go try it before you mock me. AA/AF is considered an image quality enhancement. Using AF will introduce serious texture aliasing. There is very little point in having 6xAA on when all this other crap is moving on the textures. Think about that.

VIAN's link is mostly about IQ in games with AA/AF. That does not address colorspace or any of the other things a card also does. Of course, like it matters... most monitors cannot display the colors a printer can produce (HP dumbs some of its printers down because the monitor cannot reproduce the correct color). The OS video subsystem cannot produce higher DPI in many instances, and the monitors do not support it (Longhorn/Vista actually addresses this with the Avalon video updates).
So, really, it is all relative, and unless you compare them side by side, you will not notice. I would get whatever you feel comfortable with. No matter what you buy, it is already obsolete.
Originally posted by: munky
I know both companies use a bunch of optimizations for AF, but I think the IQ is pretty close. Actually, ATI cards use gamma-corrected AA, so the edges look smoother and better than on the GF6 series. SM3 is a non-issue here, because few games use it, and already some SM3 effects cause a huge performance hit on the 6800 cards. Future games will be even more demanding, so unless you can get a 6800GT for a similar price to an X800XL, there's no reason to spend $50+ for a feature you might never benefit from.
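For what it's worth, here's a rough sketch of what "gamma-corrected" means for edge blending. This is just an illustration in Python with made-up values and a simple 2.2 gamma approximation, not either vendor's actual hardware path:

# Rough sketch: blending an edge pixel with and without gamma correction.
# The 2.2 exponent and the sample values are illustrative only.

def srgb_to_linear(c):
    return c ** 2.2

def linear_to_srgb(c):
    return c ** (1.0 / 2.2)

fg, bg, coverage = 1.0, 0.0, 0.5   # white polygon edge over black, pixel half covered

naive = fg * coverage + bg * (1 - coverage)                    # blend directly on sRGB values
correct = linear_to_srgb(srgb_to_linear(fg) * coverage +
                         srgb_to_linear(bg) * (1 - coverage))  # blend in linear light, convert back

print(round(naive, 2), round(correct, 2))   # ~0.50 vs ~0.73

The naive blend comes out noticeably darker than the gamma-corrected one, which is roughly why non-corrected edges can look harsher.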
Originally posted by: keysplayr2003
As for SM3.0 (which we all know is just a more efficient way of running code and offers no visual improvements):
I'd also like to add that there is little proof of SM3.0 causing huge performance hits on 6-series cards, because there are only "patched" SM3.0 titles to test with and not ground-up SM3.0 titles.
I believe FEAR may be a ground-up SM3.0 title, and it would seem a 6800GT scores slightly better than an X850XT PE. Correct me if I'm wrong, please.
Anyway, features + slightly better performance + SLI capability = a well-spent extra 50 bucks.
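For anyone curious what "more efficient, no visual improvements" means in practice, here's a toy Python illustration of dynamic branching, one of the headline SM3.0 features. The functions and numbers are made up for the example; it's not real shader code:

# Toy sketch: an early-out branch skips work without changing the result.
def light_contribution(surface, light):
    # stand-in for an expensive per-light calculation
    return max(0.0, surface * light)

def shade_without_branching(surface, lights):
    # SM2.0-era style: evaluate every light, then zero out the ones that don't apply
    return sum(light_contribution(surface, l) * (1.0 if l > 0.0 else 0.0) for l in lights)

def shade_with_branching(surface, lights):
    # SM3.0-style: branch around lights that can't contribute at all
    return sum(light_contribution(surface, l) for l in lights if l > 0.0)

lights = [0.8, 0.0, -0.3, 0.5]
print(shade_without_branching(0.9, lights), shade_with_branching(0.9, lights))  # same output

Same picture either way; the difference is the work you get to skip.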
Originally posted by: VIAN
I wouldn't be bitching if it wasn't noticeable, but it is in fact highly noticeable. Go try it before you mock me. AA/AF is considered an image quality enhancement. Using AF will introduce serious texture aliasing. There is very little point in having 6xAA on when all this other crap is moving on the textures. Think about that.
Originally posted by: blckgrffn
$50 more for slower performance in Far Cry (why don't more games use this engine???) and HL2/CS:S doesn't sound great to me.
The games the OP intends to play could have a lot of bearing on which card he chooses. If he is a CS nut, then there really isn't much of a question in my mind. If he plays D3 over and over... well, you know where I am going with that one.
Nat