I swear this is not a troll thread

Dec 30, 2004
12,553
2
76
but when moving from my old 9800GT to my Radeon 4850 1GB,

There is definitely an image quality difference: JAGGIES. When playing Enemy Territory: Quake Wars I never really felt the need to turn on AA with the 9800GT. With the 4850? It was the first thing I thought of: "wow, that looks bad".

AMD/ATI's AA doesn't look that good either; 2x AA just seems to blur the edges. Not attractive. I have to go to 4x for that look to go away.
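For what it's worth, there's a simple reason low AA levels can read as "blur" rather than smoothing. This is a rough sketch under a simplified model (ignoring sample patterns and gamma), not how any specific card implements it:

```python
# Simplified model: with N-sample MSAA, an edge pixel's coverage can only
# take N+1 discrete levels. At 2x that's just 3 shades along an edge,
# which tends to look like a smudge; at 4x you get 5 steps, closer to a
# real gradient.

def edge_shades(msaa_samples):
    # Possible coverage fractions: 0/N, 1/N, ..., N/N
    return msaa_samples + 1

for n in (2, 4, 8):
    print(f"{n}x MSAA -> {edge_shades(n)} intensity steps on an edge")
```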

So yeah I never gave much credence to the ATI vs Nvidia image quality thing, and it was the first thing I noticed. Bleck.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
Are you sure the game hadn't turned on AA already for you with the 9800GT? Or that you'd tweaked settings 6 months ago and had forgotten?

I noticed no image quality drop at the same settings when I moved from a 7900GTX to a 4870 in 2008.
 

crazylegs

Senior member
Sep 30, 2005
779
0
71
Are you sure the game hadn't turned on AA already for you with the 9800GT? Or that you'd tweaked settings 6 months ago and had forgotten?

I noticed no image quality drop at the same settings when I moved from a 7900GTX to a 4870 in 2008.

Sounds like a distinct possibility to me also.

Did not notice any image quality differences when I moved from an 8800GT / 9600GT to an HD4890. However, I game at 1920x1200, normally with 4xAA and 16xAF. Also, I think it has been documented on numerous occasions by various sites how similar the companies are in terms of image quality.
 

Rezist

Senior member
Jun 20, 2009
726
0
71
Most websites stopped doing screenshots of each card's IQ because they're so similar.
 

Axon

Platinum Member
Sep 25, 2003
2,541
1
76
I've never once been able to note a difference between an Nvidia or an ATi card that can run a game at equivalent settings and resolution. The only thing that has happened to me is one company's card stops working before the other's.
 

mhouck

Senior member
Dec 31, 2007
401
0
0
I remember Tom's Hardware had a comparison between the two for Prototype this past fall, and the only difference I can remember is the shadow darkness.
 

waffleironhead

Diamond Member
Aug 10, 2005
7,044
546
136
Do you still have the 9800GT?
Care to do some side by side pics?
This would be something I would like to see.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
ATI had better image quality in the past, but now the two are mostly on par with each other.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
ATi has the lead in anti-aliasing quality and nVidia has the lead in anisotropic filtering; that's why ATi cards before the HD 5x00 series had lots of texture shimmering compared to their nVidia counterparts. With the current HD 5x00 series, ATi at least matches nVidia's anisotropic filtering, and theoretically ATi's AF is better.
 
Dec 30, 2004
12,553
2
76
Are you sure the game hadn't turned on AA already for you with the 9800GT? Or that you'd tweaked settings 6 months ago and had forgotten?

I noticed no image quality drop at the same settings when I moved from a 7900GTX to a 4870 in 2008.
Sounds like a distinct possibility to me also.

Did not notice any image quality differences when I moved from 8800GT / 9600GT to HD4890. However i game at 1920x1200, normmlly with 4xAA and 16xAF. Also i think it has been documented on numerous occasions by various sites how similar the the companies are in terms of image quality.

Yeah, my 9800GT wasn't powerful enough to do any AA. It could barely keep 60 fps when I had everything set to high/ultra, and I really didn't want to have to give any of those effects up, so I tweaked the INI so that it wouldn't drop all the way to 30 if it couldn't hit 60. Their engine is odd: if the game can't hit 60, it immediately truncates to 30. Ugly.
 
Dec 30, 2004
12,553
2
76
I've never once been able to note a difference between a nvidia or an ATi card that can run a game at equivalent settings and resolution. The only thing that has happened to me is one company's card stops working before the other's.

Maybe it's just me.
 

vj8usa

Senior member
Dec 19, 2005
975
0
0
This thread seems to be turning into an nVidia vs. AMD/ATI image quality thread.... :thumbsdown:

What the hell are you talking about? The entire premise of this thread is to compare nvidia and AMD image quality.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
OP, you're not alone. I recently went from a GTS250 to a 5770, and although I love the performance of the 5770, some games just look jaggier. I can't even get AA working on BF2 :( Ah well I don't game that much anyway.

On a positive note, I can get 1080p to work properly with no tricks on my big screen with the 5770. It was nigh impossible with the GTS250, and now I don't need a DVI <> HDMI converter either.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
With current HD 5x00 series, ATi at least matches the nVidia's Anisotropic filtering but theorically, ATi's AF is better.

In practice, ATi's current AF is still very poor: they use undersampling to cheat now instead of angle dependency. Unfortunately, this actually looks worse than the way they were using angle dependency on the 4xxx parts. ATi has never had great AF, and that hasn't changed with the 5xxx parts. The overwhelming majority of people are never going to notice the difference; hell, Anand hasn't even been able to tell the difference between bilinear and trilinear since the R300 parts shipped :p
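To make the two shortcuts concrete, here's a toy model of both. The functions and numbers are hypothetical, purely illustrative of the shapes of the two optimizations, not of any real hardware:

```python
# Toy model of two AF cost-cutting strategies.
# Neither function reflects actual GPU sample counts; they only show the
# difference in *where* quality is lost.

def af_samples_angle_dependent(anisotropy, angle_deg, max_af=16):
    # Hypothetical angle-dependent AF: full sample count only near the
    # 0/90 degree axes, reduced toward 45 degrees (the classic "flower"
    # pattern seen in AF tester tools).
    axis_dist = min(angle_deg % 90, 90 - (angle_deg % 90)) / 45.0  # 0 on-axis, 1 at 45 deg
    reduction = 1.0 - 0.5 * axis_dist
    return max(1, min(max_af, round(anisotropy * reduction)))

def af_samples_undersampled(anisotropy, angle_deg, max_af=16):
    # Hypothetical undersampling: angle-invariant, but fewer samples than
    # requested everywhere, which shows up as shimmering in motion.
    return max(1, min(max_af, round(anisotropy * 0.75)))

for angle in (0, 22, 45):
    print(angle, af_samples_angle_dependent(16, angle), af_samples_undersampled(16, angle))
```

The point of the sketch: angle dependency degrades quality only at unlucky surface angles, while undersampling spreads a smaller penalty across every angle.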
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
OP, you're not alone. I recently went from a GTS250 to a 5770, and although I love the performance of the 5770, some games just look jaggier. I can't even get AA working on BF2 :( Ah well I don't game that much anyway.

On a positive note, I can get 1080p to work properly with no tricks on my big screen with the 5770. It was nigh impossible with the GTS250, and now I don't need a DVI <> HDMI converter either.

Do you have every update? I'm going to guess you are on Vista and that you do not have the service packs. I had the same issue.

In response to whoever mentioned the system dropping to 30 fps if it can't hold 60: that is called vsync.

In response to the extra jaggies, I would guess that you are using a different resolution; otherwise there is not much difference in the rendering quality of either company until AA and AF are applied, which they do very differently. Make sure you are using your monitor's recommended resolution.

Hope that helps.