Yes, it will. If it didn't, there would be no need for vsync. This is the crux of the entire problem.
Edit: and now you're changing your post.
Look, I'll try to explain.
The graphics card renders into a "frame buffer" (a chunk of memory set aside for this purpose), which is then scanned out and sent to the monitor, which draws it on the screen. This process is not instantaneous, and the number of times per second the monitor redraws what's in the frame buffer is known as the "refresh rate".
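To put rough numbers on that, here's a little standalone C sketch (not tied to any real driver API; the mode and refresh rate are just hypothetical examples) of what a frame buffer amounts to and how long one refresh cycle takes:

    #include <stdio.h>
    #include <stdlib.h>
    #include <stdint.h>

    int main(void) {
        /* Hypothetical mode: 1024x768 at 32 bits per pixel, 60 Hz refresh. */
        const int width = 1024, height = 768, bytes_per_pixel = 4;
        const double refresh_hz = 60.0;

        /* The "frame buffer" is literally just a chunk of memory this big. */
        size_t fb_size = (size_t)width * height * bytes_per_pixel;
        uint8_t *frame_buffer = malloc(fb_size);

        /* The monitor rescans that memory once per refresh cycle. */
        double scanout_ms = 1000.0 / refresh_hz;

        printf("frame buffer size : %zu bytes\n", fb_size);
        printf("one refresh cycle : %.2f ms\n", scanout_ms);

        free(frame_buffer);
        return 0;
    }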
Nowadays you usually switch back and forth between two or three frame buffers, so that the viewer never sees flickering or half-finished images when a frame has to be drawn in multiple steps. This is called double/triple buffering, and by itself it isn't the problem we're talking about in this thread, though it comes up again below.
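If it helps, here's roughly what a double-buffered render loop looks like, sketched in plain C with placeholder functions (draw_scene/present are stand-ins, not any particular graphics API):

    #include <stdint.h>
    #include <stdlib.h>

    #define FB_SIZE (1024 * 768 * 4)   /* hypothetical 1024x768x32 mode */

    static uint8_t *buffers[2];        /* two frame buffers             */
    static int back = 0;               /* buffer we are drawing into    */

    /* Stand-ins for whatever the card/driver actually does. */
    static void draw_scene(uint8_t *fb) { fb[0] = 0; /* toy "rendering" */ }
    static void present(uint8_t *fb)    { (void)fb;  /* the card would
                                                        scan this one out */ }

    int main(void) {
        buffers[0] = malloc(FB_SIZE);
        buffers[1] = malloc(FB_SIZE);

        for (int frame = 0; frame < 3; frame++) {  /* a few frames for demo */
            draw_scene(buffers[back]);  /* draw the next frame off-screen   */
            present(buffers[back]);     /* flip: monitor now shows this one */
            back = 1 - back;            /* old front buffer becomes the new
                                           back buffer                      */
        }

        free(buffers[0]);
        free(buffers[1]);
        return 0;
    }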
On all modern video cards it is possible to write to the frame buffer while the monitor is scanning it out, in order to minimize delays between frames. When you do this, you inevitably end up updating the frame buffer in the middle of a refresh cycle at least some of the time -- even if you are not drawing more frames per second than the refresh rate. When that happens, some upper portion of the screen (on average, half of it) will still show the old frame, while the rest shows the new one. If the two frames are significantly different, you get an artifact at the border between them. If this happens many times per second, you get what's commonly called "tearing" (or "screen tearing"), where you perceive a sort of jagged line (or set of lines) flickering on the screen where the border is. This is most pronounced in 3D games, because the whole image shifts as you move your viewpoint around, so consecutive frames differ a lot and the tear line is easy to see.
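Here's a toy model of why the tear line lands where it does. The numbers (768 visible lines, 60 Hz, a swap 7 ms into the scanout) are made up, but the arithmetic is the point: everything the monitor has already scanned is still the old frame, everything after the swap point comes from the new one:

    #include <stdio.h>

    int main(void) {
        /* Hypothetical numbers: 768 visible lines, 60 Hz refresh. */
        const int    lines      = 768;
        const double refresh_hz = 60.0;
        const double cycle_ms   = 1000.0 / refresh_hz;  /* ~16.7 ms scanout */

        /* Suppose the card swaps buffers 7.0 ms into the scanout. */
        double swap_at_ms = 7.0;
        int tear_line = (int)(lines * swap_at_ms / cycle_ms);

        /* Lines already scanned show the OLD frame; the rest show the NEW. */
        printf("lines 0..%d   : old frame\n", tear_line - 1);
        printf("lines %d..%d : new frame\n", tear_line, lines - 1);
        return 0;
    }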
Vertical synchronization, often abbreviated to "vsync", is a method of avoiding this. It is most useful in a double- or triple-buffered system; with a single-buffered video subsystem you get quite a bit of slowdown, because you can't draw to the frame buffer at all while it's being scanned out, whereas with double buffering you only have to wait to do the switch. Basically, vsync forces the video card to wait until the monitor has finished its current refresh cycle before switching frame buffers. This prevents tearing and similar artifacts, but it also caps your framerate at the refresh rate and can lower FPS a bit beyond that (since, on average, a finished frame has to wait half a refresh cycle before it can be shown). I find these artifacts very distracting, and the monitor can't really display more than that many frames per second anyway, so you almost always want vsync on unless you are benchmarking.
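And the "half a refresh cycle" figure is just arithmetic. Here's a minimal sketch of what a vsync'd loop does -- the functions are stand-ins rather than any real driver API -- plus the expected wait at 60 Hz:

    #include <stdio.h>

    /* Stand-ins, not a real driver API. */
    static void draw_scene(void)      { /* would render the next frame      */ }
    static void wait_for_vblank(void) { /* would block until the current
                                           refresh cycle finishes           */ }
    static void swap_buffers(void)    { /* would flip front and back buffers */ }

    int main(void) {
        const double refresh_hz = 60.0;
        const double cycle_ms   = 1000.0 / refresh_hz;

        for (int frame = 0; frame < 3; frame++) {
            draw_scene();
            wait_for_vblank();  /* vsync: swap only between refreshes,
                                   so no tear line is ever visible     */
            swap_buffers();
        }

        /* A frame that finishes at a random point in the cycle waits, on
         * average, half a cycle before it can be shown: ~8.3 ms at 60 Hz. */
        printf("average added wait at %.0f Hz: %.1f ms\n",
               refresh_hz, cycle_ms / 2.0);
        return 0;
    }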
I hope that clears things up a bit.
And you can't really talk about video cards and refresh rates *without* talking about the monitor and how it fits into the picture, so I don't see why you're objecting to it.