what? going below native res for a game on an LCD is much worse looking than turning down a couple of settings.
are you kidding me?
1920x1200 is 2,304,000 pixels
1680x1050 is 1,764,000 pixels
Rendering roughly 30% more pixels is a lot of extra work, and all it buys you is avoiding a small amount of upscaling blur.
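If you want to check the numbers yourself, here is the arithmetic as a throwaway Python snippet (nothing game-specific, just the two pixel counts quoted above):

```python
# Quick sanity check of the pixel math above (plain arithmetic).
native = 1920 * 1200      # 2,304,000 pixels
lower  = 1680 * 1050      # 1,764,000 pixels

extra = native / lower - 1
print(f"native: {native:,} px, lower: {lower:,} px, extra work: {extra:.1%}")
# -> native: 2,304,000 px, lower: 1,764,000 px, extra work: 30.6%
```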
Games also work in breakpoints. Going from low to medium can, for example, triple the rendering cost (and vastly improve image quality). In the example above, those extra 30% of pixels can be the difference between unplayable and playable FPS at medium settings, while either resolution will give you a smooth 60 fps on low; see the toy model below.
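To make the breakpoint argument concrete, here is a toy model. The cost multiplier and the baseline milliseconds-per-megapixel figure are invented purely for illustration, not measurements from any real game; it just assumes frame cost scales with pixel count times a per-pixel cost factor for each settings tier:

```python
# Toy illustration of the settings-breakpoint argument.
# The numbers (cost multipliers, baseline frame time) are made up for
# illustration only; real engines do not scale this cleanly.
BASE_MS_PER_MPIX_LOW = 5.0                 # hypothetical: ms of GPU time per megapixel on "low"
COST_MULT = {"low": 1.0, "medium": 3.0}    # the "medium triples the work" assumption

def frame_ms(width, height, preset):
    mpix = width * height / 1e6
    return mpix * BASE_MS_PER_MPIX_LOW * COST_MULT[preset]

for w, h in [(1920, 1200), (1680, 1050)]:
    for preset in ("low", "medium"):
        ms = frame_ms(w, h, preset)
        print(f"{w}x{h} {preset:6s}: {ms:5.1f} ms/frame = {1000/ms:5.1f} fps")

# With these made-up numbers: both resolutions clear 60 fps on low,
# but medium lands at ~29 fps at 1920x1200 vs ~38 fps at 1680x1050,
# i.e. the 30% fewer pixels is what tips medium into playable territory.
```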
Then there is the VRAM issue: if you avoid running out of VRAM and spilling data into system RAM, your performance is orders of magnitude better.
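As a rough illustration of why resolution itself eats VRAM (the buffer layout here is my assumption: one 32-bit color target plus one 32-bit depth/stencil target, optionally multiplied by MSAA samples; real engines allocate many more intermediate targets, so treat this as a lower bound):

```python
# Back-of-the-envelope render-target VRAM estimate.
# Assumes one 32-bit color buffer and one 32-bit depth/stencil buffer,
# scaled by the MSAA sample count. Real engines use far more targets.
def render_target_mb(width, height, msaa=1, bytes_per_pixel=4 + 4):
    return width * height * bytes_per_pixel * msaa / (1024 ** 2)

for w, h in [(1920, 1200), (1680, 1050)]:
    for msaa in (1, 4):
        mb = render_target_mb(w, h, msaa)
        print(f"{w}x{h} at {msaa}x MSAA: ~{mb:.0f} MB of render targets")
```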
All that rendering at a lower resolution does is introduce a slight blur that is not very noticeable in many game types, and it can actually end up looking similar to the post-process pseudo-AA we have been seeing recently. That is handy, since in such a situation you are probably turning AA off entirely anyway (although not always).
In some games, going from 1920x1200 to 1680x1050 meant going from no AA to being able to use AA, and that improved IQ enough to more than compensate. Turning on ambient occlusion, increasing texture or shadow quality... there are a lot of cases where it's simply better to run at a lower resolution.
Consoles, btw, figured this out long ago: most console games are rendered at a fairly low resolution and upscaled. Which incidentally means those sucky console ports fit much more naturally into this usage model (their interfaces, while horribad, are at least not impeded by the lower resolution, and the engine itself was designed with that kind of upscaling in mind).