Resolution has nothing to do with eliminating jaggies. If anything, 1080p would make the jaggies more visible. It's the anti-aliasing you want.
That SSAA needs to be a sparse-grid pattern or mixed with a rotated-grid multisampling pattern for almost perfect IQ. 1080p is not even close to killing jaggies. Neither is 1600p, for that matter.
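To make the sample-pattern point concrete, here is a toy Python sketch (my own illustration, with made-up sample offsets, not taken from any particular GPU or from this thread). With the same 4 samples per pixel, an ordered grid can only produce three distinct coverage levels on a near-horizontal edge, while a rotated-grid layout produces five, which is why RGSS-style patterns smooth the most visible stair-steps better at equal cost:

```python
# Toy comparison of 4x ordered-grid vs. 4x rotated-grid (RGSS-style) sampling.
# A pixel spans [0,1) x [0,1); the "scene" is a horizontal edge where
# everything below y = edge_height is covered. Offsets are illustrative only.
ORDERED_GRID = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
ROTATED_GRID = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

def coverage(samples, edge_height):
    """Fraction of the pixel's samples that fall below the edge."""
    return sum(1 for _, y in samples if y < edge_height) / len(samples)

# Sweep the edge through the pixel and collect the distinct coverage values
# each pattern can report; more levels means smoother gradients on the edge.
heights = [i / 100 for i in range(101)]
print(sorted({coverage(ORDERED_GRID, h) for h in heights}))  # [0.0, 0.5, 1.0]
print(sorted({coverage(ROTATED_GRID, h) for h in heights}))  # [0.0, 0.25, 0.5, 0.75, 1.0]
```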
High-PPI displays are better (e.g. very small high-resolution displays like on some phones), but they still undersample because their physical pixel counts are low.
What we need are displays with high pixel counts and high PPI. So instead of 4 million pixels on a 30” screen, we need 16 or 64 million in the same space.
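For a rough sense of those numbers (my own arithmetic, assuming a 30" 16:10 panel like the 2560x1600 monitors in this thread), doubling and quadrupling the linear resolution in the same space lands close to the 16 and 64 million pixel figures:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixel density for a panel with the given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Same 30" panel, progressively higher pixel counts (assumed resolutions).
for w, h in [(2560, 1600), (5120, 3200), (10240, 6400)]:
    print(f"{w}x{h}: {w * h / 1e6:.1f} Mpx at {ppi(w, h, 30):.0f} PPI")
# Prints roughly:
#   2560x1600:  4.1 Mpx at 101 PPI
#   5120x3200:  16.4 Mpx at 201 PPI
#   10240x6400: 65.5 Mpx at 403 PPI
```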
With that said, HL2 has a lot of shader aliasing which will be cleaned up with SSAA. If you have the performance, 2560x1600 with 8xSSAA should get you almost perfect image quality.
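As a rough sanity check on the "if you have the performance" caveat (my own back-of-the-envelope estimate that treats every SSAA sample as a full shader invocation and ignores bandwidth and memory), 2560x1600 with 8xSSAA is roughly 16x the shading work of plain 1080p:

```python
# Crude shading-cost comparison: SSAA shades every sample, so cost scales
# with resolution times the supersampling factor. This ignores bandwidth,
# framebuffer memory, and downsample cost -- it is only an order-of-magnitude check.
def shaded_samples(width, height, ssaa=1):
    return width * height * ssaa

base = shaded_samples(1920, 1080)       # ~2.1 million samples per frame
heavy = shaded_samples(2560, 1600, 8)   # ~32.8 million samples per frame
print(f"{heavy / base:.1f}x the shading work of plain 1080p")  # ~15.8x
```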
Not gone, but drastically reduced. I think most people have either forgotten or never experienced jaggies at 640x480. Now there was a case of jaggies.
I played Half-Life 2 at 1080p with 8x AA and still notice jaggies. They aren't BAD, but they are noticeable. Were PC gamers exaggerating about 1080p res? I thought it was supposed to kill jaggies?
1920x1080 (1080p) is just a resolution. It has no bearing on how developers choose to use textures. Creating textures with fewer jaggies without using AA/AF means a higher polygon count, which counts against performance. Your GPU is a tool you can use to deal with these situations.
That was the 3DO version. This was cutting edge at the time: http://www.mobygames.com/images/shots/l/362968-doom-3do-screenshot-former-human-up-close-s.jpg and we couldn't get enough of it. I think maybe we're getting a little spoiled.
