Originally posted by: Leo V
I am going to add my $.02 to your discussion on whether 1600x1200 is a reasonable resolution to strive for.
My belief is that ultimately it is; however, not at the expense of far more important steps forward such as FSAA (full-scene anti-aliasing), highly detailed geometry, and realistic lighting. I will argue that even 800x600 could be excellent if it enabled significant improvements in other aspects of visual fidelity.
Before going further, note that I run my 21" Trinitron at 1600x1200 desktop res all day. Also, I am not defending GFX or NVIDIA; I have no particular interest in joining another company vs. company debate.
Anyway, I will restate my arguments (which I've used many times before) on why lower resolution is often better, contrary to popular belief:
1) The TV argument: television has a resolution approximating 640x480. However, "graphics" seen on TV look infinitely more realistic than those rendered by computer even at 1600x1200. You'll say that is because they are showing "real" live footage on TV. But what about Pixar-type graphics or any other rendered special effects seen on TV? It still looks a million times more realistic than your 1600x1200 game, and the TV's low resolution is certainly not the deciding factor.
My point is that there are other factors to realism vastly more important than screen resolution. They are much more complicated, and deserve far greater attention than simply increasing raw fill rate and drawing more, smaller pixels.
2) You see polygonation much more easily at higher resolutions. If you noticed how triangular and rectangular the Quake (1) monsters began to look once you were able to run above 640x480, you know what I mean. Realism suffers because the vertices/triangles used to build the models become obvious.
3) When everything else remains fixed, you inevitably lose processing time to increasing resolution. This is obvious: either your framerate goes down, or you must disable visual effects to maintain it (see the rough sketch after this point).
Now, some people prefer high resolution even if that means playing at 8FPS, or without any decent lighting whatsoever. However, most of us would not consider this to be a "happy medium" for realistic graphics.
You will naturally argue that this is not a problem anymore, because your latest Radeon/Geforce can tackle even 1600x1200 without a hitch. That is not true, however--a compromise is still being made when you're running at 1600x1200. In this case, the compromise is the primitiveness of the game/application that you're running successfully at 1600x1200. If it had more complex, realistic graphics, then you would go right back to 1024x768 or even 800x600 to make it run acceptably. And most likely, it would be a favorable tradeoff! (Especially if it was designed with that resolution in mind.)
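To put some rough numbers on point 3, here is a minimal back-of-the-envelope sketch (Python) assuming a purely fill-rate-limited renderer; the fill-rate and overdraw figures are made up for illustration, not taken from any real card.

```python
# Rough illustration of the resolution/framerate trade-off, assuming the
# renderer is purely fill-rate limited (hypothetical numbers, not a real card).

FILL_RATE = 400_000_000   # pixels the card can shade per second (assumed)
OVERDRAW = 2.5            # average times each screen pixel gets drawn (assumed)

resolutions = [(800, 600), (1024, 768), (1280, 1024), (1600, 1200)]

for w, h in resolutions:
    pixels_per_frame = w * h * OVERDRAW          # work per frame grows with area
    fps = FILL_RATE / pixels_per_frame           # framerate falls proportionally
    print(f"{w}x{h}: {w * h:>9,} pixels -> ~{fps:5.1f} FPS")
```

With these assumed numbers, 1600x1200 pushes four times as many pixels per frame as 800x600, so a fill-rate-bound framerate drops by the same factor--which is exactly the budget you could otherwise spend on better lighting or geometry.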
In conclusion: resolution is by no means bad--however, it only gets you so far. Higher resolution in no way compensates for a lack of other graphical capabilities (in software as well as hardware), and can even be detrimental when that high resolution reveals their limitations.
I'm probably not returning to this thread; I just couldn't resist the temptation to write about this.