
Image Quality: can you tell the difference?

KillaKilla

Senior member
Maybe it's because my eyesight's not too good, but I'm not sure. So I ask you this: can you tell the difference between 2xAA/AF and 8x? I can't. Is there a noticeable increase between 1024x768 and 1600x1200? The only thing I've been able to notice is the difference between OpenGL/Direct3D and their software counterparts. This is, of course, due to the 2xAA and AF (it's forced in my settings).
 
To be honest, with the IQ/speed tradeoff I usually go for speed. Unreal Tournament 2004 is a great example. Yes, in theory, I could set the game to a ridiculous resolution and count the hairs on the head of the guy I'm shooting at. But I usually just leave it at a decent resolution (around 1024x768), turn off AA, and have plenty of speed. If you're rendering, it matters. For gaming, it's only one factor of many.
 
Most definitely. If your eyes are so bad that you can't see the difference, you're missing out on a lot of detail in games.
 
Originally posted by: KillaKilla
The only thing I've been able to notice is the difference between OpenGL/Direct3D and their software counterparts.


lol, then you are obviously looking at the wrong things. OpenGL and Direct3D are rendering APIs, so you really shouldn't see anything different between the two aside from a few missing effects in one when a game was primarily coded for the other.
 
Originally posted by: TheSnowman
Originally posted by: KillaKilla
The only thing I've been able to notice is the difference between OpenGL/Direct3D and their software counterparts.


lol, then you are obviously looking at the wrong things. OpenGL and Direct3D are rendering APIs, so you really shouldn't see anything different between the two aside from a few missing effects in one when a game was primarily coded for the other.

Run Counter-Strike in DirectX and then OpenGL... pretty big difference in image quality.
 
What monitor and video card are you using that you can't tell the difference between 2xAA+AF and 8xAA+AF?
 
If you can't tell the difference:

a) wash your eyes
b) go to the doctor and get prescription glasses/contact lenses
c) you are one of those lucky enough not to be bitten by the "awesome graphics" bug, and you won't need to upgrade your video card for about twice as long as everyone else (so be happy)
 
Originally posted by: Jeff7181

Run Counter-Strike in DirectX and then OpenGL... pretty big difference in image quality.

The framerate is different, as Half-Life just uses a wrapper to run in D3D mode; other than that, the lighting is a hair different and you might have to set gl_polyoffset to a negative value to get decals to show, but that is just because of the way the game is coded.

But my point is that you can't just look at some graphics and say "this is D3D and this is OpenGL," whereas you can look at a screenshot or a game and say whether it has AA or AF on.
 
I haven't been gaming much the last year, let's put that up front.

However,


When I visit various sites that show screenshots at different video card/game settings to compare them, I'm not fully appreciating the *majesty* of 8x versus 2x.

Is it the still shots that aren't conveying the difference?

For, if the difference really is captured in those still shots, then I can safely write off about 75% of the opinions I read on these boards concerning this issue.


If you tell me the difference is only truly noticeable during gameplay, then I have no comment. Well, almost none, at any rate.


-SynapticBliss
 
Originally posted by: TheSnowman
Originally posted by: KillaKilla
The only thing I've been able to notice is the difference between OpenGL/Direct3D and their software counterparts.


lol, then you are obviously looking at the wrong things. OpenGL and Direct3D are rendering APIs, so you really shouldn't see anything different between the two aside from a few missing effects in one when a game was primarily coded for the other.

I mean that I can tell the difference between using a hardware-based renderer and using the software renderer.

I also can't tell the difference between OGL and D3D. For reference, I'm currently using a Radeon 9800 Pro.
 
You need a bigger monitor.. I'm guessing you're using a 15-17" screen.
If you're using a nice 21" screen, you should easily see the difference.
 
I haven't noticed a big difference in my last few video card upgrades, but I remember upgrading from my Quantum3D SLI Voodoo2 card to a Voodoo3: it was a little faster, but I thought the IQ was better with the Voodoo2. I love how far cards have come since.

As for AA, I think it's more important for people with smaller monitors who have to play at lower resolutions than it is for people with 19"+ monitors that play at high res.

AF, I honestly can't say I've ever really noticed whether it's on or not. I probably just don't notice it because I'm not looking for it. What does it actually do, anyway? I think I'll go take some screenshots at 1600x1200 with different AA/AF settings to see what I notice.

JBlaze
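To answer the "what does AF actually do" question above: anisotropic filtering targets textures seen at glancing angles, where one pixel covers a long, thin strip of texels. Here is a rough Python sketch of the idea (a hypothetical toy model, not how real GPUs implement it): isotropic filtering averages a square footprint sized to the strip's long axis and blurs everything, while anisotropic filtering spreads its taps along the long axis only, keeping detail sharp across the short axis.

```python
def tex(u, v):
    """Toy texture: horizontal black/white stripes, one texel tall."""
    return int(v) % 2

def isotropic_sample(u, v, major):
    """Square footprint sized to the long axis: it straddles many stripes
    in v, so everything averages out toward 0.5 (the blurry look)."""
    n = int(major)
    total = sum(tex(u + i, v + j) for i in range(n) for j in range(n))
    return total / (n * n)

def anisotropic_sample(u, v, major):
    """Taps spread along the long (u) axis only: every tap lands on the
    same stripe, so the stripe edges stay sharp."""
    n = int(major)
    total = sum(tex(u + i, v) for i in range(n))
    return total / n

# A pixel whose texture footprint is 8 texels long and 1 texel thin:
print(isotropic_sample(0.0, 0.5, 8))    # 0.5 -> washed-out gray
print(anisotropic_sample(0.0, 0.5, 8))  # 0.0 -> still a crisp black stripe
```

This is why floors and roads stretching away from the camera are where AF is most visible: that's where the footprint gets most elongated.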
 
Wow, multiple poll questions. That's a neat feature. 😎

As for the questions, there are massive differences between each set. In fact I can see the difference between 8xAF and 16xAF.

Run Counter-Strike in DirectX and then OpenGL... pretty big difference in image quality.
That has nothing to do with the API; it has everything to do with how the game is coded.
 
JonnyBlaze, thanks for your screenshots. I think these illustrate my point very well. Considering the first three, a difference between them isn't immediately apparent. By immediately, I mean a glance, a glimpse, a momentary viewing. With a touch more time, albeit not much, the floor does look more impressive with the 16x AF, but not by a glaring amount.

Between the second and third shots you provided I see no visual difference without scrutinizing the pictures.

I didn't view the zoomed shots, no need to. Any differences viewed there would be next to useless.

Just like the golden ears some audiophiles claim to possess, there are golden eyes gamers will profess to possess. If ya got them peepers, well and fine for you, but to make recommendations based on your heightened retinal fortitude is, I think, misleading.

What I will concede is this: a detail here, a detail there... and there... and there. While individually these amount to almost nothing, the truth is they add up to form an impression of what is being viewed, and it is this changing impression people may notice. Or they may not.

Are the differences between pictures One and Three enough to change my immersion while creeping through the crypts waiting for who-knows-what to spring at me, attempt to eviscerate me, and leave me for dead? Yes, but not by a whole heck of a lot. Not four hundred bucks' worth, anyhow.

-SynapticBliss
 
Here's a couple of shots from Far Cry that really show how much better 8AF can make things look. Load the shots up and click back and forth between them.

no AF

8AF

Big difference.
 
Just like the golden ears some audiophiles claim to possess, there are golden eyes gamers will profess to possess. If ya got them peepers, well and fine for you, but to make recommendations based on your heightened retinal fortitude is, I think, misleading.

Misleading is pretending that by judging screenshots you can come up with a statement like that. Anyone who claims there is little or no IQ difference between AA/AF modes is kidding themselves. Ask someone who can judge them directly, and you'll find a much different observation. The difference is plainly easy to see; no so-called "golden eyes" needed.

I have several PCs at home, from budget rigs to ones high-end enough to play most games with the details maxed out... the differences are painfully obvious. Will the budget gamer give you an enjoyable gaming experience? Sure, but I'll take all the eye candy I can throw at it until it affects performance enough to make a negative difference; it just makes gameplay that much better.
 
rbV5, makes sense. You said:

Will the budget gamer give you an enjoyable gaming experience? Sure, but I'll take all the eye candy I can throw at it until it affects performance enough to make a negative difference; it just makes gameplay that much better.

I think you make my point better than I do with this statement, "Will the budget gamer give you an enjoyable gaming experience? Sure..." Then you place my point in context with the second half of that quote. Excellent.

Blastman, I viewed those screenshots, thanks for posting them. There are plenty of differences between these two screenshots. I think they bear static witness to rbV5's point quoted above. First screenshot: sure, it'll be fun, enjoyable, playable, etc. Second screenshot: all the virtues of the first... but better.

-SynapticBliss
 
Originally posted by: SynapticBliss
rbV5, makes sense. You said:

Will the budget gamer give you an enjoyable gaming experience? Sure, but I'll take all the eye candy I can throw at it until it affects performance enough to make a negative difference; it just makes gameplay that much better.

I think you make my point better than I do with this statement, "Will the budget gamer give you an enjoyable gaming experience? Sure..." Then you place my point in context with the second half of that quote. Excellent.

Blastman, I viewed those screenshots, thanks for posting them. There are plenty of differences between these two screenshots. I think they bear static witness to rbV5's point quoted above. First screenshot: sure, it'll be fun, enjoyable, playable, etc. Second screenshot: all the virtues of the first... but better.

-SynapticBliss

I disagree... the first screenshot would not be as enjoyable to me. Watching a texture go blurry as the angle between you and it changes isn't exactly a quality gaming experience. It detracts from the realism. I turned off AA and AF and played Far Cry, and it has a completely different feel than even 2xAA and 4xAF. It's... sloppy... without AA and AF.

Someone new to PC games might be able to play Far Cry on a Ti4200 at 800x600 with no AA and no AF and be impressed by it... but I'm used to 1024x768 with 4xAA and 8xAF... anything less just looks bad to me.

One final thought... there's one thing no screenshot will ever show: edge crawling. The shimmering, crawling effect aliasing produces as an object or the view moves. Anti-aliasing reduces that... and that's something you'll never experience by looking at a screenshot.
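The edge-crawling point can be shown with numbers. In this toy Python sketch (a deliberately simplified model, not a real rasterizer), a horizontal edge sweeps through one pixel in sub-pixel steps: with a single sample per pixel the value snaps between 0 and 1 (which reads as crawl/shimmer in motion), while 4x supersampling ramps smoothly.

```python
def coverage(edge_y, pixel_y, samples):
    """Fraction of this pixel's vertical sample points that fall inside
    the shape (below edge_y)."""
    inside = 0
    for i in range(samples):
        sample_y = pixel_y + (i + 0.5) / samples  # evenly spaced sub-samples
        if sample_y < edge_y:
            inside += 1
    return inside / samples

# An edge sweeping through a pixel spanning y in [0, 1):
no_aa = [coverage(e / 4, 0.0, 1) for e in range(5)]  # 1 sample per pixel
ssaa4 = [coverage(e / 4, 0.0, 4) for e in range(5)]  # 4x supersampling

print(no_aa)  # [0.0, 0.0, 0.0, 1.0, 1.0] -> the pixel "pops" (crawl)
print(ssaa4)  # [0.0, 0.25, 0.5, 0.75, 1.0] -> smooth ramp, no pop
```

The abrupt 0-to-1 jump in the single-sample case is exactly what a still screenshot can't convey: it only becomes shimmer when the edge moves frame to frame.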
 