Originally posted by: Aikouka
Originally posted by: hans030390
After reading that sentence I found you to not be a credible source. ATI and Nvidia have pretty much the same IQ. Yes, I have seen systems and pictures running on either company's cards.
Oh-ho-ho, here's my stab... after reading that sentence, I found you to be either blind or a troll. Have you seen the AF charts comparing the G80, R5xx and G70? Have you? Come back and say that to me again when you have... I have a feeling I'll be waiting a long time.
And to make you look like even more of an ill-informed punk: with the G80, nVidia's IQ is now actually better than ATi's R5xx.
Oh and congratulations, I see you learned to form an opinion and unfortunately, it differs from mine. Welcome to life, learn to live it.
I've seen plenty of AA/AF charts and screenshots comparing cards. What do you think I'm basing this on? I'm not some blind troll going off gut feeling.
I've seen pictures from the best cards out (even the G80), and the only difference you'll see is a very slight one in AA/AF, maybe a few pixels. Not to mention, those pictures are blown up specifically so you can spot the differences (and sometimes I still see none).
It's not realistic to think that everyone will be running at uber-high AA/AF/resolution settings, even on a nice card. It's even more unrealistic to think that all next-gen games will run at extremely high AA/AF settings.
So when running at low AA/AF settings, is there really a difference? No. The difference at higher settings only shows up if you stop and stare at it, OR if you blow the picture up to expose it.
Don't take me as some "ill-informed" fool. I'm constantly reading reviews on video cards and I have specifically looked at AA/AF comparisons in the past.
Applying all of this to the 360/PS3... there will be no difference in IQ based on the video cards. They'll likely be running at lower AA/AF settings anyways, so you can rule out a difference there.
IQ is not all about AA/AF. I've turned them on and off in games, and generally I don't have a preference either way. So I save the performance by turning them off... and I really don't notice a difference (and if I do, I have to stop and stare... and even then I don't care).
Aside from AA/AF, they're both capable cards. They both run Unreal Engine 3 (GoW).
Should we be worried about Resistance "not looking good"? No.
Stop comparing any form of IQ between the two systems based on which company supplied their graphics hardware.
Edit: If you read the recent AnandTech article on the G80, they have a direct comparison of the G70, the G80, and an ATI card (not sure which). I saw no difference. Maybe it was the lack of AA/AF? That's about what you'll see in games: no difference between the companies.