Then would you explain why the framerate counter does drop to 30 in some games as you just admitted in a recent post, while it doesn't in others? Do you have an explanation other than triple buffering?
The reason FPS counters show NON 60fps/30fps/20fps values is that FPS counters, on the whole, are AVERAGES: an average number of frames, usually counted over a 1 second period.
Part of the misunderstanding of vsync is that it "caps" your frame rate, but this is only what appears to be happening and is an easy way to describe its approximate behaviour. What is happening in reality is that a frame is displayed for an entire refresh. When the next refresh starts, if the next frame is ready (has been fully rendered), the buffers are flipped and the next frame is displayed; if the next frame isn't ready, the current frame is displayed for another whole refresh.
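That flip-or-repeat decision can be sketched in a few lines of Python. The 60Hz refresh window and the simple blocking double-buffer model are assumptions for illustration, not any particular driver's behaviour:

```python
import math

REFRESH_MS = 1000 / 60  # one 60 Hz refresh window, ~16.666 ms

def refreshes_held(render_ms):
    """How many whole refreshes the *previous* frame stays on screen
    while the next frame takes render_ms to draw: if the new frame is
    ready within one window, the buffers flip at the next refresh;
    otherwise the old frame repeats until the new one is complete."""
    return max(1, math.ceil(render_ms / REFRESH_MS))

print(refreshes_held(10))   # 1 -- ready in time, old frame shown once
print(refreshes_held(18))   # 2 -- misses the window, old frame repeats
```

The key point the function captures: the cost of missing the window is always a whole extra refresh, never a fraction of one.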
This is actually dependent on individual frame render times and NOT the frame rate. It means that while you can maintain over 60fps on AVERAGE, individual frames inside that 1 second of measurement may take longer than 16.666ms to render (1000ms/60hz). Those frames force the prior frame to be displayed over 2 refreshes, and cause the effective frame rate (what you actually see when vsynced) to be lower than 60fps.
In fact you can have average frame rates of way above 60fps and still not see 60 distinct frames in 1 second with vsync on.
For example: you could have an average of 100fps (10ms to render the AVERAGE frame), but in reality half the frames could be taking 2ms and the other half 18ms to render. The 18ms frames cannot finish inside the 16.666ms window, and so cause the prior frame to repeat for a 2nd refresh, giving a measured/effective fps when vsynced of 30fps.
But average those numbers:
((18ms × 30 frames) + (2ms × 30 frames)) / 60 frames
= (540 + 60) / 60
= 600 / 60
= 10ms average frame time
1000ms / 10ms = 100fps
Oh look, it's a 100fps average, so a counter averaging over 1 second would read 100fps, but vsynced you'd see something like 30fps, not the 60fps you might expect.
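A quick sanity check of that arithmetic in plain Python (nothing game-specific assumed, just the two render times from the example):

```python
fast_ms, slow_ms = 2, 18                       # the two render times above
total_render_ms = 30 * slow_ms + 30 * fast_ms  # 60 frames rendered in total
avg_frame_ms = total_render_ms / 60            # average render time per frame
avg_fps = 1000 / avg_frame_ms                  # what an averaging counter reports

print(avg_frame_ms)  # 10.0 ms
print(avg_fps)       # 100.0 -- yet vsynced you see far fewer distinct frames
```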
Now that we understand vsync is linked to individual frame times and not the average frame rate, we can see how average frame rates that are not 60fps/30fps/20fps etc. are possible.
Imagine the following series of frame render times, all in ms.
12,10,12,14,16,18,30,35,36,50,82,95,100,150,120,90,43,20,10,8,12,10,10,7,10
That all adds up to 1000ms across 25 frames, so we've averaged 25fps. Notice the first 5 frames are all below 16.666ms, so each displays on its own unique refresh just once. Once we hit the 6th frame at 18ms, it cannot render within the 16.666ms window, so the prior frame is displayed again. The next refresh rolls around (2 × 16.666) 33.333ms later, and our 18ms frame is ready and displayed. The next frame takes 35ms to render; that's longer than the 16.666ms of one refresh, so the prior frame is displayed over 2 refreshes. But 2 refreshes is only 33.333ms and this frame still isn't ready, so we display the prior frame for not 2 but 3 refreshes.
So on and so forth: some frames get displayed for not just 1 or 2 refreshes but up to 9, in the case of the frame prior to the one that takes 150ms to render.
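You can check that whole series with a short sketch, using the same simple blocking double-buffer model as before (the 60Hz window is again an assumption):

```python
import math

REFRESH_MS = 1000 / 60  # ~16.666 ms per refresh at 60 Hz

frame_ms = [12, 10, 12, 14, 16, 18, 30, 35, 36, 50, 82, 95, 100,
            150, 120, 90, 43, 20, 10, 8, 12, 10, 10, 7, 10]

print(sum(frame_ms), len(frame_ms))  # 1000 ms over 25 frames -> 25fps average

# While each frame renders, the previous frame is held on screen for at
# least one refresh, plus one more for every extra 16.666 ms window needed.
held = [max(1, math.ceil(t / REFRESH_MS)) for t in frame_ms]
print(held[:6])   # [1, 1, 1, 1, 1, 2] -- the 18 ms frame forces a repeat
print(max(held))  # 9 -- the prior frame repeats while the 150 ms frame renders
```

The list of hold counts is never a neat constant, which is exactly why the averaged counter lands on an arbitrary number like 25fps.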
That's how you squeeze any number of frames into 1 second, not just fixed fractions of the refresh rate (60fps/30fps/20fps) etc...