Image quality: who's best?

SBA

Junior Member
Jul 27, 2005
12
0
0
I am in the market for a video card in the 6800GT - X800XL range. Which has the best contrast and image quality? I would lose some frame rate for a quality image.
 

df96817

Member
Aug 31, 2004
183
0
0
It's not the video card that directly determines quality, but rather which video card will let you display at higher resolutions and enable the most eye candy such as anti-aliasing and anisotropic filtering. Also the type of monitor you use will play a part in image quality.
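[Editor's note: to illustrate the AA point above — multisampling smooths edges by taking several coverage samples per pixel and averaging them, instead of one all-or-nothing test per pixel. The sketch below is purely illustrative (a 1-D edge and a uniform 4-sample grid, not any card's actual hardware sample pattern):]

```python
# Toy sketch of why anti-aliasing smooths edges: instead of one
# coverage test per pixel, average several sub-pixel samples.
# The edge position and sample grid here are made up for illustration.

def coverage(x, edge=0.5):
    """1.0 if the sample point is inside the shape, else 0.0."""
    return 1.0 if x < edge else 0.0

def pixel_no_aa(px, edge=0.5):
    # One sample at the pixel centre: hard 0-or-1 result.
    return coverage(px + 0.5, edge)

def pixel_4x_aa(px, edge=0.5):
    # Four evenly spaced sub-samples, averaged: fractional coverage
    # on pixels the edge passes through, i.e. a smoothed edge.
    samples = [px + (i + 0.5) / 4 for i in range(4)]
    return sum(coverage(s, edge) for s in samples) / 4
```

On a pixel the edge cuts through, the single-sample version snaps to fully on or fully off, while the 4x version reports fractional coverage — which is what reads as a smoother edge on screen.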
 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
There have been a few threads and reviews about the image quality of Nvidia's cards vs. ATI's. It seems that Nvidia's Quality setting is better than ATI's Quality setting, but that their High Quality settings are pretty equal. Someone correct me if I'm wrong.

And yes, what df96817 said is correct. A card's ability to use higher resolutions and AA/AF impacts image quality the most. Both cards will serve you well. They are fairly even and it comes down to what games you play. If they are the same price or you play Doom3 a lot, I'd say go for the 6800GT. If there is a fairly big price difference, say >$30, or you play HL2 or BF2 a lot, then go for the X800XL.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
just get a 6800gt because of shader model 3.0

image quality will be roughly the same between the cards.
 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
Originally posted by: hans030390
just get a 6800gt because of shader model 3.0

image quality will be roughly the same between the cards.


Dude, SM3.0 isn't all that cool. Why spend an extra $60-70 for a feature that really doesn't offer much of a performance increase and relatively few visual amenities? I know this has been hashed out before, but I don't want the OP misled. I've had both the 6800 series and the X800 series and I really didn't notice anything great about SM3.0. I tried HDR in FarCry, but I had to play at 1024x768 to even come close to playable. If the two cards are the same cost, then I think it's smart to go with the 6800GT, but not at the price break they're currently at.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
The 6800GT is capable of displaying better image quality hands down over any current generation ATI card.
 

pulsedrive

Senior member
Apr 19, 2005
688
0
0
As has been mentioned and fought over throughout these forums MANY times, SM 3.0 will not really be an issue for someone looking for a mid-range card, because by the time games ACTUALLY use it heavily, those cards aren't going to be useful anymore anyway, at least at high settings, which is the WHOLE point of SM 3.0 anyway. Yes, I know someone is going to mention SC:CT; I don't care.

Back to what the OP was asking. Between the two, as far as "image" quality goes, I prefer my X800XL. Having had both Nvidia and ATI in my current rig, I have been happier with ATI. I actually started out ATI, then went Nvidia, then RIGHT back to ATI, all within about a month, so no net loss for me, which was good (return policies and all). The big thing is that there is no noticeable difference between the two cards, and the X800XL outperforms the 6800GT at stock in most things; if you want to dispute this, take it up with any of the benchmarks from AT, Tom's, etc. Now admittedly, at the higher resolutions, such as anything above 16x12, the 6800 will come out marginally on top. But seeing as how you can usually get an X800XL for about 50 bucks cheaper than the 6800GT (at least when I bought mine), I would go ATI.

Now I KNOW I am going to get some people talking about future-proofing and the fact that ATI doesn't have SM3, but I will point back up into my post and mention again that most people buying a mid-range card shouldn't be concerned with this kind of thing, because they will probably be purchasing a new card before it is important again.
 

KeepItRed

Senior member
Jul 19, 2005
811
0
0

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: KeepItRed
Originally posted by: VIAN
And you are a burrito.

In case you missed it, here is my thread detailing that Nvidia does indeed have better image quality than ATI.

http://forums.anandtech.com/messageview...atid=31&threadid=1535724&enterthread=y


I still see ATi leading Nvidia. Considering you're using beta drivers and no Catalyst, I find this inaccurate. Also, the ATi Call Of Duty and HL2 shots looked three times better than what Nvidia had to offer.

I can't help but think that your screen name "Keep it Red" is an ATI reference. If so, how can people take your answers seriously when your favoritism shows up even in your name? ATI and Nvidia image quality are equal these days, excluding the 7800 with TAA/SS.

 

KeepItRed

Senior member
Jul 19, 2005
811
0
0
Originally posted by: keysplayr2003
Originally posted by: KeepItRed
Originally posted by: VIAN
And you are a burrito.

In case you missed it, here is my thread detailing that Nvidia does indeed have better image quality than ATI.

http://forums.anandtech.com/messageview...atid=31&threadid=1535724&enterthread=y


I still see ATi leading Nvidia. Considering you're using beta drivers and no Catalyst, I find this inaccurate. Also, the ATi Call Of Duty and HL2 shots looked three times better than what Nvidia had to offer.

I can't help but think that your screen name "Keep it Red" is an ATI reference. If so, how can people take your answers seriously when your favoritism shows up even in your name? ATI and Nvidia image quality are equal these days, excluding the 7800 with TAA/SS.

Yes, for the high end they're about equal, but on lower-end models you can see a difference.

 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: KeepItRed
Originally posted by: keysplayr2003
Originally posted by: KeepItRed
Originally posted by: VIAN
And you are a burrito.

In case you missed it, here is my thread detailing that Nvidia does indeed have better image quality than ATI.

http://forums.anandtech.com/messageview...atid=31&threadid=1535724&enterthread=y


I still see ATi leading Nvidia. Considering you're using beta drivers and no Catalyst, I find this inaccurate. Also, the ATi Call Of Duty and HL2 shots looked three times better than what Nvidia had to offer.

I can't help but think that your screen name "Keep it Red" is an ATI reference. If so, how can people take your answers seriously when your favoritism shows up even in your name? ATI and Nvidia image quality are equal these days, excluding the 7800 with TAA/SS.

Yes, for the high end they're about equal, but on lower-end models you can see a difference.

6800GT vs. X800XL. These are not lower end. And which lower-end models offer different image quality? The entire 6 series offers the same image quality down to the TurboCache cards, and so does the entire X series from ATI, down to the X300SE HyperMemory.

 

JBT

Lifer
Nov 28, 2001
12,094
1
81
I think they offer roughly the same image quality. I've owned a 6800GT and an X800XT and they look almost identical.

As for SM3.0, I really don't think the 6800 series' SM3 capabilities should mean anything. Once we get games that use these features, performance will tank on last-gen cards. With a 7800, sure, it will probably help quite a bit, but not with a 6800GT.
 

gsellis

Diamond Member
Dec 4, 2003
6,061
0
0
Vian's link is mostly about IQ in games with AA/AF. That does not address colorspace or any of the other things a card also does. Of course, like it matters... Most of the monitors cannot display the colors a printer can make (HP dumbs some of the printers down because the monitor cannot produce the correct color.) The OS video subsystem cannot produce higher DPI in many instances and the monitors do not support it (Longhorn/Vista actually addresses this with the Avalon video updates).

So, really it is all relative and unless you compare it side-by-side, you will not notice. I would get what you feel comfortable with. No matter what you buy, it is already obsolete ;)

Edit - Fixed the misspelled name...
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
I still see ATi leading Nvidia. Considering you're using beta drivers and no Catalyst, I find this inaccurate. Also, the ATi Call Of Duty and HL2 shots looked three times better than what Nvidia had to offer.
No, you didn't read the article, you fanboy. I was using official Catalyst drivers. I used beta drivers at some point for Nvidia, but the image quality never changed.

The ATI pictures that you saw for Call Of Duty were from the 9800 Pro, which has perfect filtering, unlike the X800. The X800 doesn't have perfect filtering, and there is no way to disable its optimizations that I know of. Catalyst AI, which is supposed to disable optimizations, doesn't do it. On the other hand, Nvidia has the option to switch from Quality to High Quality, thereby getting near-perfect filtering. It's not perfect, but it's damn close. Any blemishes are mostly unnoticeable compared to the piles of crap the X800 lays out in front of you.

X800 default VS NV default VS NV High
Here are the pictures you should've been looking at.

Vian's link is mostly about IQ in games with AA/AF. That does not address colorspace or any of the other things a card also does. Of course, like it matters... Most of the monitors cannot display the colors a printer can make (HP dumbs some of the printers down because the monitor cannot produce the correct color.) The OS video subsystem cannot produce higher DPI in many instances and the monitors do not support it (Longhorn/Vista actually addresses this with the Avalon video updates).

So, really it is all relative and unless you compare it side-by-side, you will not notice. I would get what you feel comfortable with. No matter what you buy, it is already obsolete
I wouldn't be bitching if it wasn't noticeable, but it is in fact highly noticeable. Go try it before you mock me. AA/AF is considered an image quality enhancement. Using AF will introduce serious texture aliasing. There is very little point in having 6xAA on when all this other crap is moving on the textures. Think about that.
 

blckgrffn

Diamond Member
May 1, 2003
9,686
4,345
136
www.teamjuchems.com
But basically you showed us the Quake 3 engine and how driver optimisations (even turning them "off") caused artifacts that I had to see in side-by-side screenshots to notice any difference. I guess I am not that picky; I just like 16x12 and some AA and AF to sharpen details and take the jaggies off of whatever gun I happen to be holding :p

Buy the cheaper one. They both offer competitive performance, bottom line.

Nat

 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Well, if you're not gonna turn on AF at all, then go ahead and buy whichever one you want. But if you want to use AF, then you are looking at different image quality between them. I wouldn't even have had a problem if they only used filtering optimizations with AF off, but with it on it's just ridiculous. Some textures, depending on how they are detailed, cover up the optimizations pretty well, but on others it stands out like a kick in the face.

Look at the pics in my previous post. Compare X800 to High Quality Nvidia.

And remember, these are just pics. What does aliasing do when you move around? Aliasing also moves. So even if you see only a little difference, like the X800 vs. the High Quality Nvidia shot, when you move, the Nvidia makes it look soooo much nicer not to have all that aliasing on those tiles. If you have an Nvidia card, you can experience the difference by switching from default Quality to High Quality and turning on AF in game.
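[Editor's note: the "aliasing also moves" point above is the classic shimmer of an undersampled texture. A single point sample per pixel flips value as the view shifts by a fraction of a pixel, while a filtered lookup that averages over the pixel's footprint stays stable. A toy model — the stripe texture, footprint size, and tap count are all made up for illustration:]

```python
import math

# Toy model of texture shimmer: a fine stripe pattern sampled at one
# point per pixel changes value as the view shifts slightly, while a
# filtered lookup (averaging across the pixel's footprint, which is
# roughly what a good minification filter approximates) stays stable.

def stripes(u):
    """Fine stripe texture: 8 black/white cycles per texel unit."""
    return 1.0 if math.sin(2 * math.pi * 8 * u) >= 0 else 0.0

def point_sample(u):
    # One tap per pixel: prone to flicker as u shifts frame to frame.
    return stripes(u)

def filtered_sample(u, footprint=1.0, taps=64):
    # Average many taps across the pixel footprint.
    return sum(stripes(u + footprint * (i + 0.5) / taps - footprint / 2)
               for i in range(taps)) / taps
```

The point sample flips between black and white under a small shift in u, while the filtered value stays pinned near the stripe average of 0.5 — which is why better filtering removes the crawling on distant tiled floors.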
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
I know both companies use a bunch of optimizations for AF, but I think the IQ is pretty close. Actually, Ati cards use gamma-corrected AA, so the edges look smoother and better than on the gf6 series. SM3 is a non-issue here, because few games use it, and already some sm3 effects cause a huge performance hit on the 6800 cards. Future games will be even more demanding, so unless you can get a 6800gt for a similar price as a x800xl, there's no reason to spend $50+ for a feature you might never benefit from.
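[Editor's note: on the gamma-corrected AA mentioned above — frame buffers store gamma-encoded values, so averaging AA samples on the raw stored values makes edge pixels come out too dark. Decoding to linear light, averaging, then re-encoding gives a perceptually even gradient. A rough sketch; gamma 2.2 is the usual approximation (real sRGB uses a piecewise curve):]

```python
# Sketch of why gamma-corrected AA looks smoother. Averaging
# gamma-encoded sample values directly darkens edge pixels; blending
# in linear light gives the perceptually correct 50% midpoint.

GAMMA = 2.2  # common approximation of the sRGB transfer curve

def naive_blend(a, b):
    # Average the stored (gamma-encoded) values directly.
    return (a + b) / 2

def gamma_correct_blend(a, b):
    # Decode to linear light, average, re-encode.
    lin = (a ** GAMMA + b ** GAMMA) / 2
    return lin ** (1 / GAMMA)

# A pixel half-covered by white (1.0) over black (0.0):
# naive gives 0.5 stored, which is only ~22% linear brightness (too
# dark); gamma-correct gives ~0.73 stored, a true 50% linear blend.
```

That darker-than-it-should-be midpoint is what makes non-gamma-corrected edge gradients look harsher than the gamma-corrected ones.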
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: munky
I know both companies use a bunch of optimizations for AF, but I think the IQ is pretty close. Actually, Ati cards use gamma-corrected AA, so the edges look smoother and better than on the gf6 series. SM3 is a non-issue here, because few games use it, and already some sm3 effects cause a huge performance hit on the 6800 cards. Future games will be even more demanding, so unless you can get a 6800gt for a similar price as a x800xl, there's no reason to spend $50+ for a feature you might never benefit from.


I disagree. A 6800GT, 8 times out of 10, is just a tad quicker than the X800XL. Add to this SM3.0 capability (which we all know is just a more efficient way of running code and offers no visual improvements). I'd also like to add that there is little proof of SM3.0 causing huge performance hits on 6 series cards, because there are only "patched" SM3.0 titles to test with and not ground-up SM3.0 titles. I believe FEAR may be a ground-up SM3.0 title, and it would seem a 6800GT scores slightly better than an X850XTPE. Correct me if I'm wrong, please. Add HDR capability... aww crap, I'm not going to list all the features. Anyway, features + slightly better performance + SLI capability = a well-spent extra 50 bucks.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: keysplayr2003
(which we all know is just a more efficient way of running code and offers no visual improvements)

Um, except if they use it to write significantly longer/more complex pixel shaders, or use VS3.0 to do displacement mapping. However, either of these would crush anything shy of maybe a 7800GTX SLI, so the only thing it's been used for so far is improving shader efficiency.

(Also like to add that there is little proof of SM3.0 causing huge performance hits on 6 series cards because there are only "Patched" SM3.0 titles to test with and not ground up SM3.0 titles.

This effect seems to be minor at best.

I believe FEAR may be a ground up SM3.0 title and it would seem a 6800GT scores slightly better than a X850XTPE. Correct me if I'm wrong please)

Whether it is "ground-up" SM3.0 is somewhat debatable (and it's not clear to me that this would dramatically improve performance in any case), but nobody outside of the game's developers can really answer that. The 6800-class cards do perform better than the X800s, but 1) They all perform pretty badly, 2) It could be for reasons other than SM3.0, and 3) the only numbers we've seen are from an early beta of the game.

Anyway, Features + slightly better performance + SLI capability = a well spent extra 50 bucks.

Really depends on whether or not you're going to use any of the features. The performance differences are generally minor.

 

gsellis

Diamond Member
Dec 4, 2003
6,061
0
0
Originally posted by: VIAN
I wouldn't be bitching if it wasn't noticeable, but it is in fact highly noticeable. Go try it before you mock me. AA/AF is considered an image quality enhancement. Using AF will introduce serious texture aliasing. There is very little point in having 6xAA on when all this other crap is moving on the textures. Think about that.

Huh? Mock you? Never have and defended you a few times because you are right (mostly over in news when the Intel fanboys go bonkers). You misread my meaning and sorry if it was not that clear.

I was pointing out that AA/AF IQ was not the only factor on an open-ended OP question AND that if one never sits down with the two, one could be fat, dumb, and happy with what they have. There are IQ differences for AA/AF. I am not sure which card is more color accurate.

Side note - I do know that the ATI GPU does work a little better with DirectX APIs with the drivers for the one app I have, but for others, I wish I could use a nVidia card for the better OpenGL performance (an NLE for Dx9, an animation package for OpenGL). That has NOTHING to do with which looks better. Hey, you think I could Cross-Fire to a SLI card? Just kidding, Hell is more likely to freeze over.
 

blckgrffn

Diamond Member
May 1, 2003
9,686
4,345
136
www.teamjuchems.com
$50 more for slower performance in Far Cry (why don't more games use this engine???) and HL2/CS:S doesn't sound great to me :) The games the OP intends to play could have a lot of bearing on what card he chooses. If he is a CS nut, then there really isn't much of a question in my mind. If he plays D3 over and over... well, you know where I am going with that one :)

Nat
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: blckgrffn
$50 more for slower performance in Far Cry (why don't more games use this engine???) and HL2/CS:S doesn't sound great to me :) The games the OP intends to play could have a lot of bearing on what card he chooses. If he is a CS nut, then there really isn't much of a question in my mind. If he plays D3 over and over... well, you know where I am going with that one :)

Nat


What are you talking about? 6800GT vs. X800XL in FarCry at 10x7, 12x10, and 16x12, all with 4xAA/4xAF.

I do agree about HL2. Once AA/AF is enabled, the X800XL pulls ahead: a 15fps lead at 1600x1200, a 12fps lead at 1280x1024, and 2fps at 1024x768, all at 4xAA/8xAF.