It doesn't take a computer engineer to tell you you shouldn't need a special module. Just add a start/end of frame descriptor to the displayport datastream and don't have the monitor refresh until it sees the end of the next frame. Very, very, very simple. I'm an electrical engineer. It really is that simple.
Of course it's simple. Everything is simple.
I'll give you something to think about.
Maybe you can tell me the answer.
When you have a monitor that can do 30Hz - 144Hz, that actually means the monitor can hold a frame on screen for anywhere between ~7 milliseconds (1/144 s) and ~33 milliseconds (1/30 s). G-Sync (and FreeSync) ensure that everything on the screen looks smooth, as long as the monitor receives a new frame to display from the GPU at least every 33 milliseconds.
But what does a monitor do when, after 33 milliseconds, there is still no new frame?
There are 3 options.
1) It doesn't do anything. Result: the screen goes blank until a new frame arrives. This gives flickering. We don't want that.
2) The monitor displays the last frame again. To do this, the monitor needs to store the last frame somewhere. The G-Sync module has memory holding the last frame, so the G-Sync module can do this. FreeSync cannot.
3) The monitor depends on the GPU: if the interval between two frames exceeds 33 milliseconds, the GPU has to resend its last frame.
Now there is one thing that many people tend to forget when talking about networks. (And yes, the monitor and the PC form a network.) Networks never have infinite bandwidth and they never have zero delay. In our case, DP1.2a has an effective bandwidth of 17.28 Gbps, which allows something like 180-190 1440p frames per second. That means that when the GPU sends a new frame, there will be roughly 6 milliseconds between the first bit and the last bit of that frame.
That means that if the GPU needs to send a duplicate frame, to prevent the monitor from showing a blank screen, it has to make that decision not 33 ms after it finished sending its last frame, but about 27 ms after. Otherwise the monitor will not have received the full frame by the time it needs to be displayed.
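The numbers above can be checked with a few lines of back-of-envelope Python. (The 24 bits-per-pixel figure and the simple subtraction for the deadline are my assumptions; real DisplayPort adds blanking and protocol overhead, which is why the practical rate lands around 180-190 fps rather than the raw ~195.)

```python
# Back-of-envelope check of the DP1.2a numbers above.
# Assumed: 2560x1440 at 24 bits per pixel, no blanking/overhead.
BANDWIDTH_BPS = 17.28e9          # effective DP1.2a bandwidth

frame_bits = 2560 * 1440 * 24    # one uncompressed 1440p frame
frames_per_sec = BANDWIDTH_BPS / frame_bits
transfer_ms = frame_bits / BANDWIDTH_BPS * 1000

print(f"{frames_per_sec:.0f} fps raw maximum")   # ~195; overhead drops it to ~180-190
print(f"{transfer_ms:.1f} ms on the wire per frame")

# The GPU's decision deadline: the monitor must have the full frame
# in hand by the 33 ms mark, so the GPU must start resending earlier.
refresh_max_ms = 1000 / 30               # ~33.3 ms at the 30 Hz floor
deadline_ms = refresh_max_ms - 6         # ~27 ms after the last frame finished
print(f"resend decision due at ~{deadline_ms:.0f} ms")
```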
Did I make any mistakes so far?
Now what happens if the GPU finishes its next frame right after it started sending the duplicate frame? Does it stop sending the duplicate and immediately start sending the new frame? It can't, because then the monitor would not have received the full new frame before the previous frame has expired. And even if the GPU did stop mid-stream, you'd get tearing on the monitor.
With G-Sync, this problem is easier. The monitor has a copy of the last frame, which means the monitor itself can decide whether to display the last frame again or not. So now let's look at what a G-Sync monitor can do. When the current frame is about to expire after 33 milliseconds, it has to make the decision to display the last frame again or not. Since the duplicate doesn't have to travel over the link first, it has 6 milliseconds more time to make that decision!
G-Sync can do something even smarter.
When a monitor has displayed a frame for 27 milliseconds, it can look at its incoming data and see whether a new frame has started arriving. If a new frame is indeed incoming, it can wait up to 6 milliseconds to receive the full new frame, and then display it. The previous frame is never displayed twice.
Now suppose a frame has been displayed for 27 ms and no new frame is incoming. The monitor can then decide to show the current frame a second time. Note that the minimum holding time for a frame on the screen is 7 milliseconds (on a 144Hz monitor), so the repeat must stay up until roughly the 34 ms mark. Now suppose that 1 microsecond later a new frame starts coming in. It'll take 6 milliseconds to receive the full frame, so it's in the monitor's hands at ~33 ms, and 1 ms later the screen is ready to display it. Hardly any deviation from the points in time when the frames should have been displayed.
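The lookahead described above can be sketched as a toy decision function. (The names and the exact 33.3 ms / 6 ms figures are illustrative assumptions, not from any real scaler firmware.)

```python
# Toy model of the G-Sync lookahead described above. All times are in
# milliseconds, measured from the moment the current frame went on screen.
TRANSFER_MS = 6.0                      # time for a full frame to cross the link
EXPIRY_MS = 33.3                       # longest the panel holds a frame (30 Hz floor)
CHECK_MS = EXPIRY_MS - TRANSFER_MS     # ~27 ms decision point

def decide_at_checkpoint(new_frame_started_at):
    """What the monitor does at the ~27 ms checkpoint.

    new_frame_started_at: ms at which the GPU began sending the next
    frame, or None if nothing is on the wire yet.
    Returns (action, ms at which the next image goes on screen).
    """
    if new_frame_started_at is not None and new_frame_started_at <= CHECK_MS:
        # A frame is already in flight: wait for it and show it once complete.
        return ("show_new_frame", new_frame_started_at + TRANSFER_MS)
    # Nothing incoming: replay the buffered frame from the module's memory.
    return ("repeat_last_frame", EXPIRY_MS)

print(decide_at_checkpoint(25.0))   # frame mid-flight: shown when fully received
print(decide_at_checkpoint(None))   # nothing incoming: repeat at expiry
```

A FreeSync monitor can't run this logic at all, because without a local frame buffer the "repeat" branch has to be taken by the GPU, 6 ms earlier.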
Did I make myself clear?
A G-Sync monitor can be smoother at low frame rates, because it can effectively look 6 milliseconds into the future. A FreeSync monitor cannot.
But yeah, it's all simple. No reason to make things complicated. The G-Sync module is all bollocks. Any engineer can see that.