FWIU, there's hardware required on the video card that Nvidia's GPUs lack. That's why they have to use the complex G-Sync module mounted in the monitor.
Can you explain your reasoning, please?
Nvidia is not stupid. I very much doubt they would risk investing heavily in a particular technology
when the same result could be achieved with already available tech: a VESA DisplayPort update, a firmware update, and/or a better scaler.
The idea of sending frames to the display at the rate at which they become available, pacing the display's refresh rate to follow the GPU's frame generation, is simple enough (see the sketch below).
But every great idea is simple at its core, and every implementation of it runs into real-world problems - and solving a real-world problem is
never a simple task.
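As a minimal sketch of that pacing idea, assuming a hypothetical 30-144 Hz panel window and a made-up render function (none of these names correspond to any real driver or display API), the core loop might look like this:

```python
import time

# Hypothetical panel limits for this sketch, not any real monitor's spec.
MIN_HZ, MAX_HZ = 30.0, 144.0
MIN_INTERVAL = 1.0 / MAX_HZ  # fastest refresh the panel accepts (~6.9 ms)
MAX_INTERVAL = 1.0 / MIN_HZ  # longest the panel can hold a frame (~33.3 ms)

def render_frame():
    """Stand-in for GPU work; real frame times vary with scene load."""
    time.sleep(0.012)

last_refresh = time.monotonic()
for _ in range(10):
    render_frame()
    elapsed = time.monotonic() - last_refresh
    if elapsed < MIN_INTERVAL:
        # The frame finished faster than the panel can refresh: hold it.
        time.sleep(MIN_INTERVAL - elapsed)
    elif elapsed > MAX_INTERVAL:
        # The GPU was too slow: a real panel has to repeat the previous
        # frame on its own, and handling that edge gracefully is exactly
        # the kind of real-world problem the simple idea runs into.
        pass
    # "Scan out" now, at whatever rate frames actually arrive.
    now = time.monotonic()
    print(f"refresh interval: {(now - last_refresh) * 1000:.1f} ms")
    last_refresh = now
```

The happy path is trivial; the edge cases at the top and bottom of the panel's range are where a dedicated module versus an updated scaler (as discussed above) would actually differ.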
So far, FreeSync in action has been seen only by the AMD family and a few of their friends, in contrast to how G-Sync has been publicly presented, demoed, and heavily praised(!).
All this is what leads me to speculate that AMD is having early problems, and that G-Sync is the better, more advanced solution.
It's Xmas season; sales will be up for everyone.
Every G-Sync monitor sold is a customer locked in for the foreseeable future, and AMD doesn't even have an engineering sample or a convincing demo to proudly show the world.
Yet come March, Samsung is shipping.
Why not yell at the top of their lungs about a G-Sync killer... weird, no?
FreeSync can go down to 9 Hz, but I don't know of any monitors that can hold an image that long; ~15 Hz is the slowest. It's good to have the monitor be the limiting factor, though, rather than the video card.
Saying FreeSync supports 9 Hz is a bit misleading:
it's the display/panel that has to be able to go down to 9 Hz.
And even then, what good is 9 fps, not to mention... khm... 30'' at 9 Hz(!) LOL?
I doubt there is any need for this, except for marketing purposes.
But hey, why not go for it. It certainly doesn't hurt: 9 Hz - 240 Hz.
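For what it's worth, here's the arithmetic behind why such a low floor is hard: the panel must hold one static image for the full refresh interval, and the longer the hold, the more visible pixel-charge decay (flicker) becomes. A quick back-of-the-envelope:

```python
# Frame-hold time at each refresh rate: the panel keeps one static image
# on screen for 1/rate seconds between scan-outs.
for hz in (240, 144, 60, 30, 15, 9):
    print(f"{hz:3d} Hz -> frame held for {1000 / hz:6.1f} ms")
# 240 Hz -> frame held for    4.2 ms
# 144 Hz -> frame held for    6.9 ms
#  60 Hz -> frame held for   16.7 ms
#  30 Hz -> frame held for   33.3 ms
#  15 Hz -> frame held for   66.7 ms
#   9 Hz -> frame held for  111.1 ms
```

Holding an image for ~111 ms is far beyond what the ~15 Hz panels mentioned above manage, which is why a 9 Hz floor says more about the spec than about any shipping display.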