VulgarDisplay
And you're basing that statement on what, exactly?
I'd like to know, that's for sure.
And you're basing that statement on what, exactly?
One of the laptops was using traditional vsync, only refreshing the display at a fixed rate, and the quantization effect of the fixed refresh cycle introduced obvious roughness into the animation. On the other laptop, however, the motion was much smoother, with no apparent tearing or slowdowns—much like you'd see from Nvidia's G-Sync technology.
...
Regardless, the good news here is that AMD believes a very effective G-Sync-like variable refresh technology shouldn't add any cost at all to a display or system. Koduri says it "should become a free thing," and the term "free sync" is already being spoken as shorthand for this technology at AMD.
If I understand this correctly
Source: http://www.neogaf.com/forum/showthread.php?t=699523
So no matter how fast or slow the GPU is rendering frames, the monitor is never locked to a particular refresh rate. Not stuck at 60Hz, even if the GPU is stuck on 45fps. In this case, the monitor would change to 45Hz to match the framerate. And if the GPU suddenly boosts to 110fps? The monitor boosts to 110Hz too.
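As a toy illustration of that idea (my own sketch, not anything from AMD's driver), the refresh rate would just track however long the GPU took to produce each frame:

```python
# Toy illustration only -- not AMD's actual driver logic.

def matched_refresh_hz(frame_time_s: float) -> float:
    """A frame that took 1/45 s maps to a 45 Hz refresh, 1/110 s to 110 Hz, and so on."""
    return 1.0 / frame_time_s

print(round(matched_refresh_hz(1 / 45)))    # 45  -> monitor refreshes at 45 Hz
print(round(matched_refresh_hz(1 / 110)))   # 110 -> monitor refreshes at 110 Hz
```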
And you're basing that statement on what, exactly?
TechReport has also gone hands-on with it:
One of the laptops was using traditional vsync, only refreshing the display at a fixed rate, and the quantization effect of the fixed refresh cycle introduced obvious roughness into the animation. On the other laptop, however, the motion was much smoother, with no apparent tearing or slowdowns—much like you'd see from Nvidia's G-Sync technology.
...
Regardless, the good news here is that AMD believes a very effective G-Sync-like variable refresh technology shouldn't add any cost at all to a display or system. Koduri says it "should become a free thing," and the term "free sync" is already being spoken as shorthand for this technology at AMD.
http://techreport.com/news/25867/amd-could-counter-nvidia-g-sync-with-simpler-free-sync-tech
Please don't forget that even if the screens only need a firmware update to support this, a lot of screens only have a limited range for vertical frequency. My first screen at work only supports 56-76 Hz and my second 50-76 Hz, so FreeSync would not work at framerates in the 40s.
That's interesting. It does need to reach lower than that, IMO, to be truly useful. That might be part of the reason nVidia is limiting G-Sync to 120/144 Hz monitors. In nVidia's presentation they said G-Sync worked down to 30Hz, and that could well have been a limitation of the monitor itself. The other thing AMD said was that you can only extend the VBLANK for as long as the monitor's colors will hold up. That would be another limit on how low the refresh rate could go.
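To make that limit concrete, here's a toy sketch using the 50-76 Hz range from the post above (hypothetical logic, not from AMD or any spec): once the frame rate falls below the panel's minimum refresh, the driver can't stretch VBLANK any further and has to re-scan the previous frame, so the smoothness benefit disappears right where you'd want it most.

```python
# Hypothetical sketch of the refresh-range problem -- range numbers from the post above, logic is mine.

PANEL_MIN_HZ = 50.0   # longest the panel can hold a frame before its colors drift
PANEL_MAX_HZ = 76.0

def plan_refresh(frame_time_s: float):
    fps = 1.0 / frame_time_s
    if fps > PANEL_MAX_HZ:
        return ("frame ready early, wait for refresh", PANEL_MAX_HZ)
    if fps < PANEL_MIN_HZ:
        return ("cannot hold VBLANK this long, repeat previous frame", PANEL_MIN_HZ)
    return ("matched refresh", round(fps, 1))

print(plan_refresh(1 / 60))   # ('matched refresh', 60.0)
print(plan_refresh(1 / 45))   # ('cannot hold VBLANK this long, ...', 50.0) -> judder returns below 50 fps
```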
http://techreport.com/news/25867/amd-could-counter-nvidia-g-sync-with-simpler-free-sync-tech
In Koduri's assessment, it's possible to achieve a G-Sync-like animation smoothness with a combination of two techniques: dynamic refresh rates and triple buffering.
He thinks Nvidia's G-Sync hardware is doing both of these things in order to achieve the results it does, but he initially expressed puzzlement over why Nvidia chose to implement them in expensive, external hardware. After all, triple-buffering can be implemented by a game developer in software or even enabled via a software switch in a graphics driver control panel. Koduri said AMD used to have an option to force the use of triple buffering in its driver control panel, in fact, and would be willing to consider bringing it back.
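For anyone unfamiliar with the triple-buffering half of that claim, here's a minimal generic sketch (not AMD's or Nvidia's code): with three buffers the GPU always has a free one to render into, so it never stalls waiting for the display, and the display always flips to the newest completed frame.

```python
# Generic triple-buffering sketch (not AMD's or Nvidia's implementation).
# One buffer is on screen, one holds the latest completed frame, one is being rendered.

class TripleBuffer:
    def __init__(self):
        self.on_screen = 0   # buffer currently scanned out
        self.pending = None  # most recently completed frame, waiting to be shown
        self.rendering = 1   # buffer the GPU is drawing into

    def gpu_finished_frame(self):
        """GPU completed a frame: it becomes pending, and rendering moves to the free buffer."""
        old_pending = self.pending
        self.pending = self.rendering
        # The free buffer is whichever of 0/1/2 is neither on screen nor pending.
        self.rendering = ({0, 1, 2} - {self.on_screen, self.pending}).pop()
        # If an older pending frame was never shown, it is simply dropped (no GPU stall).
        return old_pending

    def display_refresh(self):
        """At refresh time, show the newest completed frame if there is one."""
        if self.pending is not None:
            self.on_screen, self.pending = self.pending, None
        return self.on_screen

tb = TripleBuffer()
tb.gpu_finished_frame()      # GPU outruns the display: frame queued, no stall
tb.gpu_finished_frame()      # newer frame replaces the queued one
print(tb.display_refresh())  # display picks up the newest frame at the next refresh
```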
Does it matter what it is as long as it achieves smooth animations?
Weird stance, considering Nvidia still fully supports all the people who went out and bought expensive hardware just to run their 3D Vision.
Yes, because there is a difference between G-Sync and V-Sync.
There’s apparently already a VESA standard for controlling VBLANK intervals. The GPU’s display engine needs to support it, as do the panel and display hardware itself. If all of the components support this spec however, then you can get what appears to be the equivalent of G-Sync without any extra hardware.
The next step was to write a little demo app that could show it working. In the video below both systems have V-Sync enabled, but the machine on the right is taking advantage of variable VBLANK intervals. Just like I did in our G-Sync review, I took a 720p60 video of both screens and slowed it down to make it easier to see the stuttering you get with V-Sync On when your content has a variable frame rate.
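If I'm reading that right, it all hinges on a capability check: the GPU's display engine and the panel both have to advertise variable-VBLANK support before the driver can use it. A hypothetical sketch of that gating (the type and field names are mine, not from the VESA spec):

```python
# Hypothetical sketch of the "everything in the chain must support it" point above.
# These dataclasses and field names are illustrative, not taken from any VESA document.

from dataclasses import dataclass

@dataclass
class DisplayEngine:
    variable_vblank: bool      # can the GPU's display engine vary the blanking interval?

@dataclass
class Panel:
    variable_vblank: bool      # does the panel tolerate a variable refresh?
    min_hz: float
    max_hz: float

def can_enable_free_sync(engine: DisplayEngine, panel: Panel) -> bool:
    """Only enable dynamic refresh when every link in the chain supports it."""
    return engine.variable_vblank and panel.variable_vblank and panel.min_hz < panel.max_hz

laptop = (DisplayEngine(variable_vblank=True), Panel(True, 40.0, 60.0))
desktop = (DisplayEngine(variable_vblank=True), Panel(False, 60.0, 60.0))
print(can_enable_free_sync(*laptop))   # True  -> variable refresh possible
print(can_enable_free_sync(*desktop))  # False -> falls back to fixed refresh / V-Sync
```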
just watched the AT video of it:
http://www.anandtech.com/show/7641/amd-demonstrates-freesync-free-gsync-alternative-at-ces-2014
youtube:
http://www.youtube.com/watch?v=pIp6mbabQeM
Looks cool.
Again, as long as it works (which, from that demo, it certainly does), who cares... especially if it's FREE too.
The main thing is whether it can adjust the refresh rate without adding more latency, I guess.
+1
G-Sync should be faster because it has dedicated hardware.
A total lack of information provided by AMD.
- Requires the driver to predict the future.
- Uses V-Sync.
- G-Sync drives the VBI in hardware; FreeSync has to speculate about the VBI (see the sketch below).
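The "predict the future" part is the interesting one to me. Purely as my own guess at what speculating the VBI might look like (nothing AMD has described), the driver would have to pick the next blanking interval from how long recent frames took, while a G-Sync module can simply hold the blanking period until the frame actually arrives:

```python
# My own guess at what "speculating the VBI" could look like -- not AMD's algorithm.
# The driver has to commit to the next blanking interval *before* the frame is done,
# so it extrapolates from recent frame times.

from collections import deque

class VbiSpeculator:
    def __init__(self, history: int = 4):
        self.frame_times = deque(maxlen=history)

    def record(self, frame_time_s: float):
        self.frame_times.append(frame_time_s)

    def next_vbi_guess(self) -> float:
        """Guess the next frame's duration from a simple average of recent frames."""
        if not self.frame_times:
            return 1 / 60          # default to 60 Hz until we have data
        return sum(self.frame_times) / len(self.frame_times)

spec = VbiSpeculator()
for t in (1 / 50, 1 / 48, 1 / 30):       # frame rate suddenly drops
    spec.record(t)
print(round(1 / spec.next_vbi_guess()))  # ~40 -> the guess lags the real 30 fps frame
```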
What information are you using to refute my thoughts?
Simply put, it means that if we don't know that something exists, that doesn't mean it doesn't exist; it only means we don't know one way or the other. We just haven't been made aware of it yet, so it's not part of our knowledge.
A video demonstration showing that it works.
What information are you using to refute my thoughts?
The video didn't show that it works.
It showed a static frame rate with V-Sync.
nVidia showed a variable frame rate without V-Sync, with a moving scene, in Montreal. AMD even compared a 30Hz V-Sync setting to their "FreeSync", which doesn't make any sense.
So, please show us proof that FreeSync can deliver on AMD's promises.
Or conversely, G-Sync could be slower because it has to go through yet another hardware stage in the display pipeline. See, speculation goes both ways.
The only people who would want to see AMD fail are NV shareholders. Gamers want fierce competition from both sides; it drives innovation and lowers prices for everyone.
