Lowering the resolution on your monitor below its native resolution, while making a game run faster, usually does not speed the game up as much as running it at that same resolution on a different, lower-resolution monitor at its native output. (As far as I know)
Ex: Gaming at 1680x1050 on a native 1920x1200 monitor is not as fast as gaming on a native 1680x1050 monitor.
This is false in the sense you are asking about; however, there are a few things worth mentioning:
First, from a framerate standpoint, the native resolution of your monitor does not matter. Something rendered at 1920x1080 renders at the same speed whether your monitor is 2560 wide, 1920 wide, 640 wide, or not connected at all. The video card doesn't care. The actual protocol by which the framebuffer is transferred to the monitor is very similar to how it was twenty years ago: line by line, down the screen.
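As a rough back-of-the-envelope sketch (my own arithmetic, not anything from a GPU spec), the per-frame cost the card pays scales with the pixel count of the resolution it renders at, regardless of which monitor the frame is sent to:

```python
# Sketch: per-frame pixel counts for the resolutions discussed above.
# Illustrative only; the GPU's render cost tracks the render resolution,
# not the monitor's native resolution.

resolutions = {
    "1680x1050 (render target)": (1680, 1050),
    "1920x1200 (monitor native)": (1920, 1200),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels per frame")

# 1680x1050 -> 1,764,000 pixels; 1920x1200 -> 2,304,000 pixels.
# Rendering at 1680x1050 fills ~23% fewer pixels per frame, and that
# saving is the same no matter which monitor ends up displaying it.
```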
From a monitor standpoint, it can matter a little more. When a digital monitor runs at anything other than its native resolution (in fullscreen mode), it needs to do extra processing to make the image fit. The time your monitor takes to stretch, compress, or "black-bar" the image depends on the monitor: some add essentially no processing time for this operation, while others take around 30ms.
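To put that 30ms worst case in perspective, here is a quick sketch of my own, assuming a 60 Hz refresh rate (an assumption for illustration, not a figure from the answer above):

```python
# Sketch: express a scaler delay in 60 Hz frame times. The 30 ms value is
# the worst case mentioned above; the 60 Hz refresh rate is assumed.

refresh_hz = 60
frame_time_ms = 1000 / refresh_hz        # ~16.7 ms per frame at 60 Hz

for scaler_delay_ms in (0, 30):
    frames = scaler_delay_ms / frame_time_ms
    print(f"{scaler_delay_ms} ms of scaling adds ~{frames:.1f} frames of display lag")

# Note this is extra latency between the GPU's output and the panel,
# not a change in the framerate the game itself achieves.
```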