No, it has nothing to do with HDCP.
If your monitor's native resolution is 1680x1050, that's how many pixels it has, and running any other resolution means at least one of the following:
1) Scaling (interpolation)
2) Black bars
If you run 1600x1200 on a 1680x1050 panel, what ideally happens is the image gets downsampled to 1400x1050 (to match your monitor's vertical resolution) and displayed with black bars on the sides to fill in the missing 280 pixels. This maintains the correct aspect ratio and maximizes vertical resolution without losing any content (no cropped edges, for example). It is slightly less sharp / detailed than just running 1680x1050 or 1400x1050 natively.
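The math behind that is just a fit-inside scale. Here's a rough sketch (the function name and rounding are my own, not any driver's actual code):

```python
def fit_with_bars(src_w, src_h, panel_w, panel_h):
    # Pick the largest scale that keeps the whole image on the panel
    # without distorting the aspect ratio.
    scale = min(panel_w / src_w, panel_h / src_h)
    out_w = round(src_w * scale)
    out_h = round(src_h * scale)
    # Whatever pixels are left over get filled with black bars
    # (on the sides = pillarbox, top/bottom = letterbox).
    return out_w, out_h, panel_w - out_w, panel_h - out_h

# 1600x1200 on a 1680x1050 panel:
print(fit_with_bars(1600, 1200, 1680, 1050))  # (1400, 1050, 280, 0)
```

That's where the 1400x1050 and the 280 pixels of side bars come from: the vertical scale (1050/1200 = 0.875) is the limiting one, so 1600 × 0.875 = 1400.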
Alternatively, you can stretch 1600 out to 1680 and squash 1200 down to 1050, which looks really terrible, because then your aspect ratio is incorrect.
At least maintain a correct aspect ratio!!
If there's no visible difference, you're just not seeing it!
If you can't see it, and don't care, then consider yourself lucky!
~MiSfit