Originally posted by: TecHNooB
ATI has better image quality. NVidia is faster.
*Hops in anti-flame mobile and drives off*
Those are some very impressive scores on the HDR+AA Far Cry stuff. The only problem is that in newer games I highly doubt things will be playable at that resolution, which sucks for people like me with a 2405FPW.
Originally posted by: Elfear
I've had both and I wish I would have taken some screenshots when I had my GTs. I haven't really taken any time to see if I notice any difference (which probably means it's not a huge difference). The thing I am really impressed with is the ability of the ATI card to do HDR+AA. Some people have said that it's an unimportant feature but having played FarCry with the 7800GTX I used to have I can say it is a huge difference. The game looks absolutely gorgeous at 1920x1200 6AA/8AF with HDR enabled and details maxed. It is very playable too in single player mode. Minimum frames sit at ~33-35fps and usually hover in the low 40's in the Training level. I'm amazed that the X1900XT can handle that kind of load but it just chugs right along.
Sorry that doesn't really answer your question about the AA and AF. I'll pay a little closer attention next time I play.
Originally posted by: mylok
I currently have an ATI X1900XTX, which replaced my BFG 7800GTX OC (I have owned two GTXs). I have noticed a difference in IQ and I prefer the ATI; the colors are not as vibrant on the GTX. The HQ AF on the ATI is much better than the AF on the GTX. AA is about the same, but I take less of an fps hit with the ATI. I currently have three 6600GTs for the wife and kids and the X1900XTX for me, and I prefer the IQ on the ATI.
Originally posted by: Elfear
I've had both and I wish I would have taken some screenshots when I had my GTs. I haven't really taken any time to see if I notice any difference (which probably means it's not a huge difference). The thing I am really impressed with is the ability of the ATI card to do HDR+AA. Some people have said that it's an unimportant feature but having played FarCry with the 7800GTX I used to have I can say it is a huge difference. The game looks absolutely gorgeous at 1920x1200 6AA/8AF with HDR enabled and details maxed. It is very playable too in single player mode. Minimum frames sit at ~33-35fps and usually hover in the low 40's in the Training level. I'm amazed that the X1900XT can handle that kind of load but it just chugs right along.
Sorry that doesn't really answer your question about the AA and AF. I'll pay a little closer attention next time I play.
Originally posted by: SickBeast
Those are some very impressive scores on the HDR+AA Far Cry stuff. The only problem is that in newer games I highly doubt things will be playable at that resolution, which sucks for people like me with a 2405FPW.
Originally posted by: Elfear
I've had both and I wish I would have taken some screenshots when I had my GTs. I haven't really taken any time to see if I notice any difference (which probably means it's not a huge difference). The thing I am really impressed with is the ability of the ATI card to do HDR+AA. Some people have said that it's an unimportant feature but having played FarCry with the 7800GTX I used to have I can say it is a huge difference. The game looks absolutely gorgeous at 1920x1200 6AA/8AF with HDR enabled and details maxed. It is very playable too in single player mode. Minimum frames sit at ~33-35fps and usually hover in the low 40's in the Training level. I'm amazed that the X1900XT can handle that kind of load but it just chugs right along.
Sorry that doesn't really answer your question about the AA and AF. I'll pay a little closer attention next time I play.
Originally posted by: sxr7171
He just said that it is playable at 1920x1200. Were you reading that?
Originally posted by: CaiNaM
nothing having to do with performance, but rather image quality.
what is your opinion on the ati highQ AF? noticeable difference from 7800?
also what is your opinion on the overall AA differences? does transparency AA make a difference?
thanks for the input!
Originally posted by: tuteja1986
Originally posted by: TecHNooB
ATI has better image quality. NVidia is faster.
*Hops in anti-flame mobile and drives off*
techNooB :! you haven't been reading a lot :! ATI is faster and has better IQ :! I love you technoob 🙂 /me calls in airstrike on TecHNooB's anti-flame mobile
ATI HAS MUCH MUCH BETTER IQ :!
FLAME ME ALL YOU WANT : )
nah seriously :! I changed from a 7800GTX to an X1900XT and I am very happy with the performance and the IQ, and I love AVIVO 🙂
If the Nvidia 7900GT beats the 7800GTX then it's a pretty awesome card for $299, but if the X1800GTO beats a 7800GT and just beats the 7800GTX then it's a much better deal for $249.
I would wait for the X1800GTO, 7900GT, and 7900GTX to come out, because then Nvidia and ATI will go into a price war :!
But to your question : ATI HAS MUCH BETTER IQ THAN NVIDIA :!
P.S. Technoob :! you have Corsair ValueSelect 2 x 512GB :? fix it noob 🙂
It may not be "that bad", but I certainly feel that if I buy a new top-end card I should be able to play *any* game maxed out on my screen.
Originally posted by: stelleg151
Originally posted by: sxr7171
He just said that it is playable at 1920x1200. Were you reading that?
He said newer games and he's right. But I don't know why he's that worried, because playing non-native isn't that bad.
That's why my next upgrade will probably be R600/G80. 😉
Originally posted by: BassBomb
I highly doubt the X1900XTX will run UT2007 at that resolution with 4xAA/16xAF and full HDR and stay over 60 FPS.