Anyone who's had both a 7800GT/GTX and X1800/X1900

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
nothing having to do with performance, but rather image quality.

what is your opinion on the ATI High Quality AF? noticeable difference from the 7800?

also, what is your opinion on the overall AA differences? does transparency AA make a difference?

thanks for the input!
 

TecHNooB

Diamond Member
Sep 10, 2005
7,458
1
76
ATI has better image quality. NVidia is faster.

*Hops in anti-flame mobile and drives off*
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: TecHNooB
ATI has better image quality. NVidia is faster.

*Hops in anti-flame mobile and drives off*

thank you; appreciate your time, but that's not what i asked... and i'm also not looking for someone who has formed an opinion by listening to their friends or one website or another, but rather someone who has had hands-on experience with both (and assuming that's not you, given the X800 in your sig). i've had both the X800 and NV40 and was happy with the IQ of both.

what i'm interested in now is the new AA modes in the G70 and the new AF modes in the R520/R580 - more specifically, how they compare in "hands-on" evaluations across multiple games, something you kind of need to own one to be able to judge.
 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
I've had both and I wish I would have taken some screenshots when I had my GTs. I haven't really taken any time to see if I notice any difference (which probably means it's not a huge difference). The thing I am really impressed with is the ability of the ATI card to do HDR+AA. Some people have said that it's an unimportant feature but having played FarCry with the 7800GTX I used to have I can say it is a huge difference. The game looks absolutely gorgeous at 1920x1200 6AA/8AF with HDR enabled and details maxed. It is very playable too in single player mode. Minimum frames sit at ~33-35fps and usually hover in the low 40's in the Training level. I'm amazed that the X1900XT can handle that kind of load but it just chugs right along.

Sorry that doesn't really answer your question about the AA and AF. I'll pay a little closer attention next time I play.
 

Crescent13

Diamond Member
Jan 12, 2005
4,793
1
0
I've only had my 7800GT, no ATI, but I love the quality of it. The Transparency Supersampling AA makes all the difference.
 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
Originally posted by: TecHNooB
ATI has better image quality. NVidia is faster.

*Hops in anti-flame mobile and drives off*

You better keep on driving ;)

Actually, with AA they seem to be equal, but ATI takes an ever so slightly smaller performance drop with AA enabled (as in, you will not notice it most of the time).

With AF, ATI seems to have a slight lead, but both are comparable. The performance hit is again smaller with ATI here, but not by much.

And HDR+AA, yes, it's a nice feature, and it's useful (and I want it)

*looks at 7800GT's in SLI and cries* :(

(*also hops in anti-flame mobile*) ;)
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Elfear
I've had both and I wish I would have taken some screenshots when I had my GTs. I haven't really taken any time to see if I notice any difference (which probably means it's not a huge difference). The thing I am really impressed with is the ability of the ATI card to do HDR+AA. Some people have said that it's an unimportant feature but having played FarCry with the 7800GTX I used to have I can say it is a huge difference. The game looks absolutely gorgeous at 1920x1200 6AA/8AF with HDR enabled and details maxed. It is very playable too in single player mode. Minimum frames sit at ~33-35fps and usually hover in the low 40's in the Training level. I'm amazed that the X1900XT can handle that kind of load but it just chugs right along.

Sorry that doesn't really answer your question about the AA and AF. I'll pay a little closer attention next time I play.
Those are some very impressive scores on the HDR+AA Far Cry stuff. The only problem is that in newer games I highly doubt things will be playable at that resolution, which sucks for people like me with a 2405FPW.
 

mylok

Senior member
Nov 1, 2004
265
0
0
I currently have an ATI X1900XTX, which replaced my BFG 7800GTX OC (I have owned two GTXs). I have noticed a difference in IQ, and I prefer the ATI; the colors are not as vibrant on the GTX. The HQ AF on the ATI is much better than the AF on the GTX. AA is about the same, but I take less of an fps hit with the ATI. I currently have three 6600GTs for the wife and kids and the X1900XTX for me, and I prefer the IQ on the ATI.
 

Crescent13

Diamond Member
Jan 12, 2005
4,793
1
0
Originally posted by: mylok
I currently have an ATI X1900XTX, which replaced my BFG 7800GTX OC (I have owned two GTXs). I have noticed a difference in IQ, and I prefer the ATI; the colors are not as vibrant on the GTX. The HQ AF on the ATI is much better than the AF on the GTX. AA is about the same, but I take less of an fps hit with the ATI. I currently have three 6600GTs for the wife and kids and the X1900XTX for me, and I prefer the IQ on the ATI.


rich guy :p

My family GPU's

Mom: Geforce FX5200
Dad: Geforce 6600GT on Zalman VF-700Cu
Sister: Radeon 9200
Me: Geforce 7800GT on Zalman VF-700Cu LED
 

kingdomwinds

Member
Dec 18, 2004
164
0
0
what about framerate? I'm not happy with my 7800GTX 256MB. How much difference is there between the X1900XT and the 7800GTX 256MB?
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Originally posted by: Elfear
I've had both and I wish I would have taken some screenshots when I had my GTs. I haven't really taken any time to see if I notice any difference (which probably means it's not a huge difference). The thing I am really impressed with is the ability of the ATI card to do HDR+AA. Some people have said that it's an unimportant feature but having played FarCry with the 7800GTX I used to have I can say it is a huge difference. The game looks absolutely gorgeous at 1920x1200 6AA/8AF with HDR enabled and details maxed. It is very playable too in single player mode. Minimum frames sit at ~33-35fps and usually hover in the low 40's in the Training level. I'm amazed that the X1900XT can handle that kind of load but it just chugs right along.

Sorry that doesn't really answer your question about the AA and AF. I'll pay a little closer attention next time I play.



Wow that's crazy!
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Originally posted by: SickBeast
Originally posted by: Elfear
I've had both and I wish I would have taken some screenshots when I had my GTs. I haven't really taken any time to see if I notice any difference (which probably means it's not a huge difference). The thing I am really impressed with is the ability of the ATI card to do HDR+AA. Some people have said that it's an unimportant feature but having played FarCry with the 7800GTX I used to have I can say it is a huge difference. The game looks absolutely gorgeous at 1920x1200 6AA/8AF with HDR enabled and details maxed. It is very playable too in single player mode. Minimum frames sit at ~33-35fps and usually hover in the low 40's in the Training level. I'm amazed that the X1900XT can handle that kind of load but it just chugs right along.

Sorry that doesn't really answer your question about the AA and AF. I'll pay a little closer attention next time I play.
Those are some very impressive scores on the HDR+AA Far Cry stuff. The only problem is that in newer games I highly doubt things will be playable at that resolution, which sucks for people like me with a 2405FPW.

He just said that it is playable at 1920x1200. Were you reading that?
 

stelleg151

Senior member
Sep 2, 2004
822
0
0
Originally posted by: sxr7171

He just said that it is playable at 1920x1200. Were you reading that?

He said newer games, and he's right. But I don't know why he's that worried, because playing non-native isn't that bad.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: CaiNaM
nothing having to do with performance, but rather image quality.

what is your opinion on the ATI High Quality AF? noticeable difference from the 7800?

also, what is your opinion on the overall AA differences? does transparency AA make a difference?

thanks for the input!

My honest and unbiased opinion:

As someone who values AF quality very much, to me HQ AF is quite noticeably better than anything nVidia offers. Even the default AF on the ATi card is noticeably better than nVidia's default quality AF. As for the AA differences, I'd say both are on par, with neither looking any better than the other. Performance AAA on the ATi card is better than the TRMSAA found on the nVidia card, but TRSSAA looks ever so slightly better than Quality AAA if you look very, very carefully (most wouldn't see it, though).

Another subject often not brought up is shadow quality. IMO, from what I've seen so far, shadows are rendered slightly better on nVidia hardware than they are on ATi. This could be due to a driver bug, I'm not sure, but in BF2 (despite its crappy, subpar shadows) the shadows on my GTX looked slightly better than they do on my XTX.
 

VERTIGGO

Senior member
Apr 29, 2005
826
0
76
I have to agree. The shadows in BF2 looked like crap on my X850XT, and while they are infinitely better on the X1900XT, there are still problems. I have to say, though, for putting out mad frames with AAA and HQ AF maxed, BF2 is gorgeous on my 244T.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
I'm going to talk about this in terms of single cards because my CrossFire experience is minimal...

AF: ATI wins this hands down. Once you know that NV's AF is angle-dependent, you can actually watch it change as you move from side to side (changing angles). That being said, neither is bad by any means, but ATI definitely looks better.
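A toy sketch of what "angle-dependent" means here. This is purely illustrative, not real driver or hardware logic: it assumes a filter that only reaches the requested anisotropy level when the surface's screen-space rotation sits near a "preferred" angle (multiples of 45° in this sketch), dropping toward a low floor in between - which is why the blur on tilted surfaces appears to shift as you strafe.

```python
# Toy model of angle-dependent anisotropic filtering (AF).
# Assumptions (not actual GPU behavior): preferred angles every 45 degrees,
# a linear falloff between them, and a 2x floor.

PREFERRED_STEP = 45.0  # assumed preferred angles: 0, 45, 90, ...

def effective_aniso(requested: int, surface_angle_deg: float,
                    angle_dependent: bool) -> int:
    """Return the anisotropy level this toy filter actually applies."""
    if not angle_dependent:
        return requested  # "HQ AF" style: full level at every angle
    half = PREFERRED_STEP / 2
    # degrees away from the nearest preferred angle (0..22.5)
    off = half - abs((surface_angle_deg % PREFERRED_STEP) - half)
    # linearly fade from full anisotropy at a preferred angle
    # down to the 2x floor halfway between preferred angles
    frac = 1.0 - off / half
    return max(2, round(requested * frac))

if __name__ == "__main__":
    for angle in (0.0, 10.0, 22.5, 45.0, 90.0):
        print(angle,
              effective_aniso(16, angle, True),
              effective_aniso(16, angle, False))
```

With 16x requested, the angle-dependent path keeps 16x at 0°/45°/90° but collapses to 2x at 22.5°, while the angle-independent path stays at 16x everywhere, matching the visible difference described above.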

AA: Both AAA and transparency AA (especially supersampling) are great. After playing with these features, it's hard to imagine games without them. Overall, though, I give the AA win to NV for single cards simply because NV's max AA is 8x and ATI's is 6x, and there is a difference. Again, neither is bad, but I think 8x AA just looks less jagged.

HDR+AA: I don't know, I haven't tried it.

Someone else already mentioned that the colors on their ATI card simply looked better, and I have to say that I agree. The colors are overall deeper and just more pleasant to look at on the ATI card. Personally, I think this is the main reason people have been claiming that ATI has better IQ all this time.
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: TecHNooB
ATI has better image quality. NVidia is faster.

*Hops in anti-flame mobile and drives off*

techNooB :! you haven't been reading a lot :! ATI is faster and has better IQ :! I love you technoob :) /me calls in airstrike at TechNooB's anti-flame mobile

ATI HAS MUCH MUCH BETTER IQ :!
FLAME ME ALL YOU WANT : )

nah seriously :! i changed from a 7800GTX to an X1900XT and i am very happy with the performance and the IQ, and I love AVIVO :)

if the Nvidia 7900GT beats the 7800GTX then it's a pretty awesome card for $299, but if the X1800GTO beats a 7800GT and just about beats the 7800GTX then it's a much better deal for $249.

I would wait for the X1800GTO, 7900GT, and 7900GTX to come out, because then Nvidia and ATI will go into a price war :!

But to your question: ATI HAS MUCH BETTER IQ THAN NVIDIA :!

P.S. Technoob :! you have Corsair ValueSelect 2 x 512GB :? fix it noob :)
 

TecHNooB

Diamond Member
Sep 10, 2005
7,458
1
76
Originally posted by: tuteja1986
Originally posted by: TecHNooB
ATI has better image quality. NVidia is faster.

*Hops in anti-flame mobile and drives off*

techNooB :! you haven't been reading a lot :! ATI is faster and has better IQ :! I love you technoob :) /me calls in airstrike at TechNooB's anti-flame mobile

ATI HAS MUCH MUCH BETTER IQ :!
FLAME ME ALL YOU WANT : )

nah seriously :! i changed from a 7800GTX to an X1900XT and i am very happy with the performance and the IQ, and I love AVIVO :)

if the Nvidia 7900GT beats the 7800GTX then it's a pretty awesome card for $299, but if the X1800GTO beats a 7800GT and just about beats the 7800GTX then it's a much better deal for $249.

I would wait for the X1800GTO, 7900GT, and 7900GTX to come out, because then Nvidia and ATI will go into a price war :!

But to your question: ATI HAS MUCH BETTER IQ THAN NVIDIA :!

P.S. Technoob :! you have Corsair ValueSelect 2 x 512GB :? fix it noob :)

ahahAHAHAHAHAAHAHA......... fixed.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: stelleg151
Originally posted by: sxr7171

He just said that it is playable at 1920x1200. Were you reading that?

He said newer games and hes right. But I dont know why hes that worried, because playing non native isnt that bad.
It may not be "that bad," but I certainly feel that if I buy a new top-end card, I should be able to play *any* game maxed out on my screen.

I refuse to buy any card that can't run UT2007 at 1920x1200 at 60fps or better. I can run non-native now on my X800 Pro - why would I buy a new card just to run at a non-native resolution again?
 

BassBomb

Diamond Member
Nov 25, 2005
8,390
1
81
I highly doubt the X1900XTX will run UT2007 at that resolution + 4xAA/16xAF with full HDR and stay over 60 FPS.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: BassBomb
i highly disbelieve the X1900XTX will run UT2007 @ that reso + 4xAA /16xAF WITH full HDR and be over 60 FPS
That's why my next upgrade will probably be R600/G80. ;)
 

JimmyH

Member
Jul 13, 2000
182
12
81
Originally posted by: SickBeast
That's why my next upgrade will probably be R600/G80. ;)

I doubt they'll do UT2k7 maxed out @ 60fps in a 40-person game of Onslaught. *Shakes magic 8-ball*: "Signs point to NO."