I said that the CPU keeps updating frames all the time so that it can send the most recent frame to the GPU whenever the GPU is ready. The GPU itself can also pre-render a certain number of frames so it always has the freshest frame to display.
I can tell you that you have a very strange idea of how the graphics pipeline and the programming around it work. I'm not sure how I can help you here without going too much into detail, but from the CPU's point of view it works like this:
1) Process inputs
2) Estimate the next frame time -> this determines the animation for the next frame
3) Prepare buffers, set up render state, shaders, etc.
4) "Draw" command
5) Loop back to 3 as often as necessary/desired
6) "Buffer swap" command
7) Loop back to 1 for the next frame
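The loop above can be sketched in a few lines. This is a toy simulation, not a real graphics API: `submit` stands in for the driver's command queue, and the scene is just (position, velocity) pairs, all hypothetical names for illustration.

```python
def render_frame(estimated_dt, scene_objects, submit):
    """One iteration of the CPU-side loop: estimate the frame time,
    advance animations by that estimate, queue draws, queue the swap."""
    # 1) process inputs (elided here)
    # 2) the frame-time estimate determines how far animations advance
    objects = [(pos + vel * estimated_dt, vel) for pos, vel in scene_objects]
    # 3)-5) prepare state and issue one draw per object
    for obj in objects:
        submit(("draw", obj))
    # 6) queue the buffer swap; the GPU executes it after the draws
    submit(("swap",))
    return objects

queue = []
# assume a 60 Hz target, so the estimate is 1/60 s; one object moving
# at 60 units/s ends up advanced by exactly 1 unit for this frame
objs = render_frame(1 / 60, [(0.0, 60.0)], queue.append)
```

Note that the function only *queues* commands; nothing in it waits for the GPU, which is the point made below.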
In any case, the GPU might not start on the draw command immediately, and the programmer should make no assumptions about when the GPU starts; the command is queued in hardware and will be processed eventually. The same goes for the buffer swap: the only guarantee is that the swap happens after all previously queued draw commands have been processed.
So the idea that the CPU processes whole frames and then eventually sends one of the (potentially many) computed frames to the GPU is inherently flawed. Likewise, the notion that the CPU sends frames when the GPU is ready is wrong. As explained above, the CPU makes no assumptions about when the GPU is ready; it just queues commands, which the GPU will start processing when it is ready. There is no explicit synchronization of the form: wait until the GPU is ready -> send the next frame. The synchronization happens implicitly.
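This implicit synchronization can be modeled as a simple FIFO: the CPU side returns immediately after enqueuing, the GPU side drains the queue whenever it is ready, and the only guarantee is ordering. A toy sketch (not any real driver's API):

```python
from collections import deque

class CommandQueue:
    """Toy model of a hardware command buffer. The CPU never learns
    when the GPU runs a command; FIFO ordering is the only guarantee,
    so a swap is processed only after every draw queued before it."""
    def __init__(self):
        self.pending = deque()
        self.executed = []

    def submit(self, cmd):
        # CPU side: enqueue and return immediately, no waiting
        self.pending.append(cmd)

    def gpu_step(self):
        # GPU side: runs "whenever it is ready", at its own pace
        if self.pending:
            self.executed.append(self.pending.popleft())

q = CommandQueue()
for cmd in ["draw A", "draw B", "swap"]:
    q.submit(cmd)
# the CPU is free to start preparing the next frame right here,
# while the GPU drains the queue independently
while q.pending:
    q.gpu_step()
```

The FIFO is what makes "swap happens after all previous draws" hold without any explicit wait-until-ready handshake.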
And the reason why so many new games stutter on so many different systems?
Yup, the thread that was made for an ancient 1.5 GHz core runs way too fast on desktop CPUs. So you are right: instead of running ahead of the GPU by a significant margin, it stops from time to time so that the graphics can catch up. Result = stutter.
There are many reasons for stutter. One of them should be obvious when you look at the program flow outlined above. The CPU estimates when the frame is actually going to be displayed in order to determine the next keyframe of the animation. Say it assumes 16.6 ms relative to the previous frame; it then moves all objects by a time equivalent of 16.6 ms. If more than 16.6 ms have elapsed by the time the buffer swap happens, you are seeing the keyframe at the wrong time -> you see stutter.
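The arithmetic behind that mismatch is trivial but worth writing out. The engine animates for the frame time it *predicted*, while the viewer sees the frame at the time it was actually *displayed*; the difference shows up as a positional jump. The numbers below are illustrative, not from any particular game:

```python
# Stutter from a wrong frame-time estimate.
predicted_dt_ms = 16.6   # what the engine assumed when animating
actual_dt_ms = 25.0      # the swap came late this frame

speed = 1.0              # units per millisecond, for simplicity
animated_distance = speed * predicted_dt_ms  # where the engine drew the object
correct_distance = speed * actual_dt_ms      # where it should appear at display time
error = correct_distance - animated_distance # visible jump of 8.4 units
```

The object appears 8.4 units behind where smooth motion would put it; the eye perceives that discontinuity as stutter even though every frame was rendered correctly.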
Typically, a PC game engine makes an estimate based on the previous frame time, while on consoles you can often assume a fixed frame time due to the fixed hardware.
That's, by the way, where the term "slowdown" comes from -> always assuming a fixed frame time when the hardware cannot keep pace slows the animations down. (E.g. preparing animations with 16.6 ms keyframe spacing but displaying frames with 33.3 ms spacing makes everything run at half speed.)
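The half-speed figure falls straight out of the ratio of the two frame times, using the numbers from the example above:

```python
# "Slowdown": animations keyed for a fixed frame time, displayed
# at a slower rate, run at the ratio of the two times.
assumed_frame_ms = 16.6   # what the animation was prepared for
actual_frame_ms = 33.3    # what the hardware actually achieves

perceived_speed = assumed_frame_ms / actual_frame_ms  # ~0.5, i.e. half speed
```

Unlike stutter, this is a uniform error: every frame is late by the same factor, so motion stays smooth but everything moves too slowly, which is why old fixed-timestep console games slow down rather than judder when the hardware is overloaded.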