starting to find the whole 'current king of 3d' silly

Plester

Diamond Member
Nov 12, 1999
3,165
0
76
I can't help but look at reviews of the 9700 Pro/FX Ultra/9800 Pro and wonder if the numbers aren't becoming increasingly irrelevant. I realize that the 0.1% of computer users worldwide who are consumed by the bleeding edge, many of whom hang out at enthusiast sites like Anandtech, will read this and say "what's your point, dumba$$?", but as someone who has certainly fallen into that 0.1% from time to time, I find myself caring less and less.

These cards can all run at high resolutions very fast without AA and AF. Sure, turning AA on starts to slow them down, but as I recall from before I switched to an LCD display, 1600x1200 doesn't really need AA enabled anyhow. Not to mention that all the bells and whistles in fast-moving multiplayer games are lost on most people, unless every now and then you stop, look around, and say "gee whiz" before someone smokes you.

And then the LCD thing comes into play. I realize the true enthusiast will probably stick with a CRT for a long time to come, but many have switched to LCDs. In my case I can't run higher than 1024x768, and if I upgrade, the panel will cap at 1280x1024; quite frankly, my Ti4600, along with a host of other cards, is more than enough for that.

I understand that more complex games will come out to feed the flame ad infinitum, but a lot of people don't care whether card X can run at 1600x1200 with 4x AA and 8x AF faster than card Y. The history of 3D acceleration is a mighty short one, and yet it already flirts with the absurd. Just a thought, and I know I will get a full dose of people's minds on this.
 

Shade4ever

Member
Mar 13, 2003
120
0
0
Exceptionally noticeable or not, I think that ANYONE who plays with higher settings can and will eventually notice a difference in visuals. If you CAN make things look great, why shouldn't you?

An intentionally extreme representation of your opinion:
"Well, I'm just interested in the gameplay, so screw it, I'll just stick to B&W @ 480x320. I mean, I'm too busy shooting people to pay attention to color, much less detail!"

No, I'm not flaming you, just trying to get my point across. And I'm not exactly one to talk, with a GF3 that I only recently tried AA/AF on for the first time. But if I bought one of those cards, you can bet I'd have the AA/AF settings up in a heartbeat.
 

Plester

Diamond Member
Nov 12, 1999
3,165
0
76
I use AA all the time at 1024x768 because I can, and everything is playable and looks outstanding. It's those nosebleed resolutions, where the big battles between nVidia's and ATI's flagships take place, that I'm referring to.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Sometimes AF can greatly improve the look of a game: texture blending can look awful, and AF then almost becomes a necessity, even if AA doesn't.
 

selfbuilt

Senior member
Feb 6, 2003
481
0
0
I agree that the difference at the "nosebleed" end, between the Radeon 9700/9800 and GeForce FX 5800 series, is probably somewhat irrelevant in the long run. But what about the more mainstream mid-range cards, like the Radeon 9500/9600 and GeForce4/FX 5600 series? From my experience and my review of the various online sites, the key defining difference between the mid-range cards is AA/AF performance at lower resolutions.

For example, on modern systems (i.e. a 2GHz CPU or equivalent and higher), the performance loss on the Radeon cards going from 1024x768 to 1600x1200 is about the same as going from no AA/AF to 4xAA/8xAF at 1024x768 (i.e. a 10-15% hit on the 9700 Pro and a 25-30% hit on the 9500 Pro, in either case on newer games). So it comes down to your personal preference (and your monitor) which option you go for.
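
To put rough, purely illustrative numbers on that equivalence (the 60 fps baseline here is my assumption, not a measured figure): if a 9500 Pro gets 60 fps at 1024x768 with no AA/AF, a 25-30% hit leaves roughly 42-45 fps whether you step up to 1600x1200 or enable 4xAA/8xAF at 1024x768, while a 10-15% hit on a 9700 Pro off the same baseline still leaves about 51-54 fps either way.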

But on the older GeForce4s (and to a lesser extent even the FXs), you take a much greater hit going to 4xAA/8xAF. Is that going to be a major issue for you? Maybe not, depending on the rest of your system and the games you play. But does it limit your options? Potentially, yes.

In my opinion, the issue should be whether or not you can run decent AA/AF with a fairly negligible performance hit on the mid-range cards. Sure, you could always play games at 1600x1200 instead, but isn't it nice to have the option?



 

BoberFett

Lifer
Oct 9, 1999
37,562
9
81
The technology that's in the mainstream/budget cards had to come from somewhere. It eventually filters down from the high end cards. If there were never any "Kings of 3D" developed, we'd still be using software renderers.
 

temckean

Junior Member
Apr 5, 2003
16
0
0
I agree with the above post. Personally, I like the battle for one reason: I never buy a brand new card. I can usually wait about 6 months and get it for half the price it was released at, sometimes even cheaper with OEM. It will still be a great card, and if a brand new one weren't being released, it wouldn't lose so much retail value.
 

CurtCold

Golden Member
Aug 15, 2002
1,547
0
0
Yes and no. Some people told me that AA and AF didn't matter because they didn't have time to look at the eye candy. Well, when I first installed my 9500 Pro, I played without AA and AF, mostly on Q3-based games. Faster performance, but not a real big difference in quality over my 8500.

That all changed when I enabled AA and AF. Games are crisper, and the detail level is much better. Granted, I doubt it will help my gaming performance, but games are much easier on my eyes, and I'm impressed that I can get 6K 3DMarks with AA and AF enabled.

AA and AF make a pretty big difference at 1024x768, which is what I game at on my 17" monitor. I didn't buy bleeding edge, but the 9500 will last me a while.

The difference in 2K3 was the most noticeable. That game looks outstanding with AA and AF enabled.
 

figgypower

Senior member
Jan 1, 2001
247
0
0
In some games these extra details come in handy, in sniping for example. All I'm saying is that, yes, you may just be running around killing things, but certain people can have an advantage with better graphics. It also knocks down prices and makes 3D more affordable in general. Now, assuming that everyone's even, the fact is that 3D hardware is ahead of the software. But I'm willing to bet that in a little while the software (the games) will again raise the bar for 3D gaming and strain the hardware once more. Ignoring that, the whole point of 3D gaming is to seem believable, if not realistic. You may not "realize" the better graphics, but eventually it adds to a much more immersive experience.
 

eklass

Golden Member
Mar 19, 2001
1,218
0
0
I think you're silly.

No, really though, you don't know what you've got until you've lost it... or something like that...
Right now I'm happy pushing 35ish fps at 800x600, medium detail, in BF1942 on my GeForce2 MX.
Will I be happier at 1024x768 or 1152x864 with AA, AF, and high detail?

You bet your butt I will!
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,000
126
These cards exist because there is a demand for them. It's as simple as that.
 

Rio Rebel

Administrator Emeritus / Elite Member
Oct 9, 1999
5,194
0
0
There's not really a right or wrong here, just opinion.

Actually, I agree with Plester that the bleeding edge isn't as important to me, but for different reasons. I don't usually play multiplayer games, so the eye candy actually is important to me. I just don't need 100+ fps, since I'm not "fragging" online. On those rare occasions when I *do* want to frag, I just turn off AA/AF and enjoy the game at 1280 resolution or so (using a 2500+ Barton with a GF4 4400).

I am getting WAY more enjoyment right now with my DVD burner than I would for the same amount of money in a Radeon 9700 pro. It's all good.
 

nippyjun

Diamond Member
Oct 10, 1999
8,447
0
0
Originally posted by: figgypower
But I'm willing to bet that in a little while the software (the games) will again raise the bar for 3D gaming and strain the hardware once more. Ignoring that, the whole point of 3D gaming is to seem believable, if not realistic. You may not "realize" the better graphics, but eventually it adds to a much more immersive experience.


I think the hardware is already being strained. Personally, I strive to run all games at 1600x1200. With my new video card I can do that (not at 60 fps, usually 30-50), but I can't use AA or AF because the FPS drops below 30. Even below 1600x1200, in a game like Generals or Unreal 2, I get some stuttering at times. Even the best hardware now is being strained at the highest resolutions with AA and AF. I think these current new games have raised the bar.

 

Plester

Diamond Member
Nov 12, 1999
3,165
0
76
I'm starting to wonder what my point was myself now, other than to get a discussion going. Demand = innovation = win for the rest of us. So what is my point? I guess I'm sick of reading how the 9800 and the FX do blah blah at 4x AA blah blah blah... maybe I should stop reading reviews until I need a new card.

Now I have revealed to all who read this post how my little pea brain operates.
 

Looney

Lifer
Jun 13, 2000
21,938
5
0
Originally posted by: Plester
I use AA all the time at 1024x768 because I can, and everything is playable and looks outstanding. It's those nosebleed resolutions, where the big battles between nVidia's and ATI's flagships take place, that I'm referring to.

It sure does matter. Have you tried running AA/AF at 1600x1200 and higher? You definitely do see a difference. Even at 1920x1200, I see a difference with AA/AF.
 

Insidious

Diamond Member
Oct 25, 2001
7,649
0
0
I usually upgrade when some new game just isn't running the way I want it to. (Something about turning off detail and/or shadows just makes me need to upgrade.)

With this in mind, you can probably understand why I am happy with a Ti4400.

But rest assured, when the next resource hog gets published and it's a game I like, I'll be looking at the bleeding edge once again (of course, I'll then find it costs too much $$ and go one step below it :D ).

-Sid
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Moralpanic
Originally posted by: Plester
I use AA all the time at 1024x768 because I can, and everything is playable and looks outstanding. It's those nosebleed resolutions, where the big battles between nVidia's and ATI's flagships take place, that I'm referring to.

It sure does matter. Have you tried running AA/AF at 1600x1200 and higher? You definitely do see a difference. Even at 1920x1200, I see a difference with AA/AF.
I have. With AA, the results are negligible at those resolutions; AF helps at all resolutions, but Performance AF is preferable in all instances. Just curious, though: what games do you play at 1600x1200 or 1920x1200 with AF and AA enabled?

Chiz
 

Looney

Lifer
Jun 13, 2000
21,938
5
0
Originally posted by: Plester
I wish I could, but we LCD users don't see resolutions like that.

Well then, why are you making statements that if you're running at 1600x1200, you don't need AA/AF enabled?