[TechPowerUp article] FreeSync explained in more detail


Abwx

Lifer
Apr 2, 2011
G-Sync knows exactly how long it takes. As long as the next frame doesn't need more than 33.3 ms, the G-Sync module will prevent the display from refreshing. There is no fixed refresh rate on a G-Sync monitor until you hit the maximum Hz.
And if it takes longer than 33.3 ms, the G-Sync module will send the current frame again.
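
Roughly, the behavior being claimed there looks like this (a Python sketch of my own reading, not NVIDIA's actual firmware; gpu, panel, and wait_for_frame are made-up names, and the 30 Hz floor comes from the 33.3 ms figure above):

Code:
MAX_HOLD = 1 / 30  # ~33.3 ms: the longest the module will hold off a refresh

def gsync_module_loop(gpu, panel):
    held = None
    while True:
        # Hold the panel idle until a new frame arrives, up to 33.3 ms...
        frame = gpu.wait_for_frame(timeout=MAX_HOLD)
        if frame is not None:
            held = frame             # new frame arrived in time
        if held is not None:
            panel.scan_out(held)     # ...otherwise the held frame is repeated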


If, however, the source image rate falls beneath the minimum rate of the display device, the IRU has a few options. It could simply repeat image frames as needed to achieve a rate within the display limits. More generally, any integer multiple of the image source rate within the minimum and maximum display refresh rates can be used. For example, consider a 24 Hz film source and a display with a 30 Hz to 60 Hz dynamic refresh rate ability. In this case the IRU could repeat each film frame, thereby resulting in a 48 Hz image rate. If there is no integer multiple between the image rate and the display rate, then again frame repetition or frame rate conversion from the source image rate to the display rate can be used (as is known in the prior art for fixed refresh rate displays). Further still, frame rate conversion (interpolation) can be used to arrive at a suitable rate, or for the purpose of improving display image quality relative to simple frame repetition.

http://www.google.com/patents/US20080055318
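
The integer-multiple rule from that patent paragraph is easy to make concrete (a small sketch of my own that just illustrates the arithmetic, not anything from AMD's or the patent's code):

Code:
def repeat_multiplier(source_hz, min_hz, max_hz):
    # Smallest integer multiple of the source rate that lands inside the
    # display's dynamic refresh window, or None if no multiple fits.
    n = 1
    while source_hz * n <= max_hz:
        if source_hz * n >= min_hz:
            return n
        n += 1
    return None

print(repeat_multiplier(24, 30, 60))  # -> 2: each film frame repeated, 48 Hz
print(repeat_multiplier(25, 30, 60))  # -> 2: 50 Hz
print(repeat_multiplier(14, 30, 60))  # -> 3: 42 Hz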
 

ocre

Golden Member
Dec 26, 2008
Okay, and how does G-Sync know this?

There is a chip in the monitor, and it is synchronized with the GPU. It is actually pretty elegant. The 33.3 ms is like a fail-safe if the graphics card starts to struggle with a frame.
 

bystander36

Diamond Member
Apr 1, 2013
Okay, and how does G-Sync know this?
With G-Sync, the monitor polls. It is constantly asking the GPU, once every 1-3 ms (not sure of the exact timing, but that is how much latency it is said to have). If the GPU has a finished frame, then the monitor goes into vertical blanking mode and updates the image.
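
In pseudocode, that polling scheme would look something like this (my own sketch of what is being described; gpu, panel, and the 2 ms interval are guesses, not NVIDIA's real interface):

Code:
import time

POLL_INTERVAL = 0.002  # somewhere in the rumored 1-3 ms window

def monitor_poll_loop(gpu, panel):
    while True:
        if gpu.has_finished_frame():           # the monitor asks the GPU
            panel.enter_vblank()               # go into vertical blanking
            panel.scan_out(gpu.fetch_frame())  # then update the image
        else:
            time.sleep(POLL_INTERVAL)          # ask again a couple of ms later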

Based on this article, I'm guessing FreeSync is going to have to tell the monitor, at the end of each finished frame, when to refresh next. This will likely require a triple-buffering system, with the display always showing the frame that is one frame behind the action. This may make games feel more laggy.
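
If that guess is right, the buffering would work roughly like this (a toy model of my speculation only, nothing AMD has confirmed):

Code:
from collections import deque

class TripleBuffer:
    # Toy model of the guessed scheme: the GPU renders into a small queue
    # and the display always shows the frame one step behind the newest.
    def __init__(self):
        self.frames = deque(maxlen=3)

    def render(self, frame):
        self.frames.append(frame)      # newest frame goes to the back

    def frame_to_display(self):
        if len(self.frames) >= 2:
            return self.frames[-2]     # shown image lags the action by a frame
        return self.frames[0] if self.frames else None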

That said, the article wasn't quite specific enough on FreeSync to know for certain how they determine when to start the next refresh, only that it has to be figured out ahead of time.
 

Gloomy

Golden Member
Oct 12, 2010
Why have all this complication? Why not have the monitor hold the frame until a new one is sent down the pipe, all by default? I don't see why VSYNC is even a thing, let alone GSync or FreeSync... why is it even possible to refresh two frames at once to begin with??
 

bystander36

Diamond Member
Apr 1, 2013
Why have all this complication? Why not have the monitor hold the frame until a new one is sent down the pipe, all by default? I don't see why VSYNC is even a thing, let alone GSync or FreeSync... why is it even possible to refresh two frames at once to begin with??
I suppose if monitors had a buffer inside them, this would be possible, but they don't. If they had a buffer, then any frame sent to a monitor would go into the buffer instead of to the screen, and as soon as the transfer finished, the monitor could display it. Of course, this could cause latency issues if the transfer time isn't all that fast.
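
Something like this, say (purely imaginary hardware; link, panel, and their methods are made-up stand-ins):

Code:
def receive_into_buffer(link, panel):
    buffer = bytearray(panel.frame_size)
    offset = 0
    while offset < len(buffer):
        chunk = link.read()            # a slow transfer adds latency right here
        buffer[offset:offset + len(chunk)] = chunk
        offset += len(chunk)
    panel.show(buffer)                 # flip only once the whole frame is in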

Anyway, this is not how monitors work, and they don't have buffers, so Nvidia and AMD have their products coming out. AMD is trying to use existing tech, which sounds like it may have some limitations as a result, and Nvidia is using new tech, which is going to cost money and lock you into Nvidia.

Hopefully AMD's system is better than it sounds. It might be, we still have unanswered questions on Freesync.
 

sontin

Diamond Member
Sep 12, 2011
Why have all this complication? Why not have the monitor hold the frame until a new one is sent down the pipe, all by default? I don't see why VSYNC is even a thing, let alone GSync or FreeSync... why is it even possible to refresh two frames at once to begin with??

Because the display doesn't know when the next frame will come, yet it still needs to refresh itself.
 

Gloomy

Golden Member
Oct 12, 2010
I wonder if AMD came up with FreeSync before or after G-Sync was released?

All signs point to the technology being there before G-Sync. Whether or not they decided to push it for gaming after Nvidia had the idea is anyone's guess (I personally think this is a recent decision, and they wouldn't have unless Nvidia had done it).
 

bystander36

Diamond Member
Apr 1, 2013
It sounds more like they were working on a low-power mobile display tech using variable VBLANK. After G-Sync came to light, they decided to repurpose it for this.
 

Gloomy

Golden Member
Oct 12, 2010
It sounds more like they were working on a low-power mobile display tech using variable VBLANK. After G-Sync came to light, they decided to repurpose it for this.

Right, I think Nvidia had the gaming use case thought up before AMD did. ;)

I wonder if Nvidia laptops will support variable vblank? Or will they need an FPGA, 768MB of RAM, and constant hardware polling too? o_O
 

blackened23

Diamond Member
Jul 26, 2011
I think that is incorrect - there was a program linked in the other FreeSync thread to test your GPU's capabilities, and Nvidia GPUs do seem to support variable vblank (edit: actually it tests your monitor, it seems). As far as working on laptops, that depends entirely on the panel firmware. What you'll find is that most laptop screens support it (from what I've read) while desktop screens do not. That means desktop monitor firmware will have to be upgraded, but it is not user-upgradeable. As well, most monitor manufacturers don't make a regular practice of changing firmware unless there's a manufacturing issue with it.

So it essentially depends on the LCD panel you're using supporting variable vblank. Hopefully what happens is AMD will pitch this to panel manufacturers and get them all to fix their crap, release new firmware, and get an upgrade/RMA type of thing going for users.

The one concern I have is that, from reading various analyses of FreeSync, it seems that due to the FPGA buffer the Nvidia version will be better at removing input lag. But we don't have enough information on FreeSync, as AMD has not released any white paper on it. It's at a very early, rough stage for AMD right now, whereas Nvidia has more or less put everything "out there" with their version. But if FreeSync is free, with minimal cost in terms of getting a monitor with proper firmware and variable vblank, that would be an attractive alternative to a $200 upgrade module for NV's hardware solution.
 

Gloomy

Golden Member
Oct 12, 2010
I guess Nvidia will design a mask and replace the FPGA with an ASIC eventually, so they can provide enough volume for laptops and cellphones... though I wonder how cellphone manufacturers will react when they learn that they need 768MB more memory and an extra chip on their logic boards to support power-saving tech if they want to use Tegra. I'm sure Nvidia will sell it well though, they're really adept at getting people to play follow-the-leader :thumbsup:
 

blackened23

Diamond Member
Jul 26, 2011
I don't know if we're at the point where many people would want this in phones. I stream Netflix and Amazon video and game on my phone sometimes, but not all that often; I think most users really would not care, because streamed content doesn't have a variable framerate, and most phone-oriented games are designed to run at a low (non-variable) framerate anyway. I was thinking it could be good for console gaming and HDTVs, or maybe Android tablets for gaming. But most tablet gamers are super casual and just game while they're waiting on something or have downtime. Or are on the can. Heh.

Console gaming on an HDTV would be an obvious candidate, because even the next-gen consoles have occasional framerate variances - though not as much as the PC does, since consoles are fixed hardware while PCs aren't.
 

96Firebird

Diamond Member
Nov 8, 2010
...though I wonder how cellphone manufacturers will react when they learn that they need 768MB more memory and an extra chip on their logic boards to support power-saving tech if they want to use Tegra. I'm sure Nvidia will sell it well though, they're really adept at getting people to play follow-the-leader :thumbsup:

Why would cell phone manufacturers want a technology like this? They don't even abide by VESA standards, which means the screens don't have VBLANK anyway. Cell phone manufacturers would have to come up with a completely new way of handling it; neither G-Sync nor FreeSync would work.
 

dacostafilipe

Senior member
Oct 10, 2013
With G-Sync, the monitor polls. It is constantly asking the GPU, once every 1-3 ms (not sure of the exact timing, but that is how much latency it is said to have). If the GPU has a finished frame, then the monitor goes into vertical blanking mode and updates the image.

That's incorrect, it's the other way around:

After the GPU renders a frame with G-Sync enabled, it will start polling the display to see if it’s in a VBLANK period or not to ensure that the GPU won’t scan in the middle of a scan out.
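
In other words, something like this (my sketch of that quote; display.in_vblank() and the rest are made-up names, not a real driver API):

Code:
import time

def gpu_present(display, frame):
    # After rendering, the GPU polls the display until it reports a
    # VBLANK interval, so a scan-in never overlaps a scan-out.
    while not display.in_vblank():
        time.sleep(0.001)      # poll roughly every millisecond
    display.scan_in(frame)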

So, why does the GPU need this polling in the first place? If the frame just needs to be pushed to the controller to be displayed, why the polling?

But the interesting question is: What does the G-Sync controller do that can't be done on the GPU?

We need to wait for more information about FreeSync, but as of now, both seem similar to me. I'm not saying that FreeSync is better than G-Sync, or that G-Sync is a bad implementation - not at all.