All Samsung's Ultra HD monitors in 2015 to support FreeSync

Status
Not open for further replies.

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
As much as I am a fan of underdogs, Nvidia's solutions are almost always better thought out and better implemented. Their business strategy is also superior, for example requiring an Nvidia chip in monitors to support G-Sync.

Well, if it's just faith in nVidia then that's fine for you to believe. It of course doesn't make it so.

Still curious if f1sherman has a more objective reason.
 

dacostafilipe

Senior member
Oct 10, 2013
772
244
116
If I were to speculate, I'd say that in their current implementations G-Sync dominates FreeSync, with "almost as good" still a realistic outcome for FreeSync in the short to medium term.

Yes, maybe. We'll have to wait for the first tests.

From the PR side (yes, yes, I know :sneaky:), FreeSync should work exactly like G-Sync:

[attached images: presentation slides]


There was a slide in Richard Huddy's PDXLAN presentation that showed a graph (yes, again here :sneaky:) of how FreeSync works over time. Sadly, the video is now down ...


Nvidia does not, and most likely never will, support FreeSync.
Nvidia may one day create its own FreeSync-like solution built on top of VESA's Adaptive-Sync.

I think that most people here who say that nVidia will support FreeSync are actually referring to Adaptive-Sync ...

I would not be surprised if nVidia releases an update to G-Sync that supports A-Sync monitors right around CeBIT 2015.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I have a UD590. How would I go about using FreeSync with my 290X?

Monitors that support FreeSync are not out yet. You'll likely have to replace your monitor for FreeSync, unless they offer some type of add-in module like nVidia did for G-Sync. Drivers that support FreeSync are supposed to be released by AMD in December (that's likely when we'll see any reviews start). Until then, the cards have no FreeSync capability at all.
 

Eric1987

Senior member
Mar 22, 2012
748
22
76
Monitors that support FreeSync are not out yet. You'll likely have to replace your monitor for FreeSync, unless they offer some type of add-in module like nVidia did for G-Sync. Drivers that support FreeSync are supposed to be released by AMD in December (that's likely when we'll see any reviews start). Until then, the cards have no FreeSync capability at all.

Not according to the article I've read. According to it, my UD590 has the capability. But I'll find the information elsewhere.
 

SoulWager

Member
Jan 23, 2013
155
0
71
Very bold claims! Proof?
If you understand how DisplayPort and G-Sync work, it's pretty obvious what the GPU needs to do. See below.


FWIU there's hardware required on the video card that is lacking on nVidia's. That's why they have to use the complex G-Sync module mounted in the monitor.
Source?

DisplayPort is just data. All the hardware to get the information to and from the monitor existed before G-Sync. The hardware required to push a frame out with arbitrary timing has existed in Nvidia cards since G-Sync. The hardware required to buffer a frame on the GPU so that it can be re-sent has been there since v-sync was invented. What's left?
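To make the difference concrete, here's a toy Python model of fixed vs. variable refresh. All numbers (refresh interval, VRR window) are illustrative, not real panel timings, and the forced-repeat behavior at the slow end is ignored:

```python
# Toy model: with v-sync, a finished frame waits for the next fixed refresh
# tick; with variable refresh, the panel refreshes the moment a frame is ready.

def fixed_refresh_display_times(render_times_ms, refresh_ms=16.7):
    """Each frame is displayed at the first fixed refresh tick after it finishes."""
    shown = []
    t = 0.0
    for r in render_times_ms:
        t += r                                  # frame finishes rendering at t
        ticks = int(t // refresh_ms) + 1
        shown.append(ticks * refresh_ms)        # displayed at the next tick
    return shown

def variable_refresh_display_times(render_times_ms, min_interval_ms=6.9):
    """Each frame is displayed as soon as it's ready, limited only by the
    panel's minimum refresh interval (its maximum refresh rate)."""
    shown = []
    t = 0.0
    last = 0.0
    for r in render_times_ms:
        t += r
        display = max(t, last + min_interval_ms)
        shown.append(display)
        last = display
    return shown

frames = [10.0, 12.0, 9.0]                      # render times in ms
print(fixed_refresh_display_times(frames))      # frames 2 and 3 collide on a tick
print(variable_refresh_display_times(frames))   # [10.0, 22.0, 31.0]
```

Note how, in the fixed-refresh case, two frames land on the same 33.4 ms tick (one is effectively dropped or stuttered), while variable refresh shows each frame the instant it's done.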
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Source? What exactly does the GPU need to do for adaptive sync that wouldn't need to be done for g-sync or v-sync?

We are still waiting on your source for the claim that nVidia can just add it to existing hardware. If it were so easy, I am sure all AMD cards would have supported it, rather than only a smaller selection.

The hardware simply doesn't support DP1.2a output.
 

SoulWager

Member
Jan 23, 2013
155
0
71
We are still waiting on your source for the claim that nVidia can just add it to existing hardware. If it were so easy, I am sure all AMD cards would have supported it, rather than only a smaller selection.

The hardware simply doesn't support DP1.2a output.

Yes it does: 1.2a existed long before Adaptive-Sync was added to it as an option. All you need to support "1.2a" is HBR2 or MST compatibility. http://www.vesa.org/wp-content/uploads/2013/06/DisplayPort-Marketing-Guidelines-v1_1.pdf
 

dacostafilipe

Senior member
Oct 10, 2013
772
244
116
Displayport is just data. All the hardware to get the information to and from the monitor existed before G-sync. The hardware required to push a frame out with arbitrary timing exists in nvidia cards since g-sync. The hardware required to buffer a frame on the GPU so that it can be re-sent has been there since v-sync was invented. What's left?

You are right, but maybe there's a timing issue.

Things like G-Sync/A-Sync require extremely precise control over the DP data channel. Maybe older hardware lacks the fine-grained control needed because of hardware simplifications.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Yes it does: 1.2a existed long before Adaptive-Sync was added to it as an option. All you need to support "1.2a" is HBR2 or MST compatibility. http://www.vesa.org/wp-content/uploads/2013/06/DisplayPort-Marketing-Guidelines-v1_1.pdf

You haven't proved in any way that the hardware supports DP1.2a. Also, 1.2a is an optional spec. Even DP1.3 is nowhere to be found today: zero cards. And we only have two HDMI 2.0-capable cards after a year.

Also, if it were so easy, I am sure Intel would have announced support for it as well. Yet they haven't, though it can be very useful in movie playback. The simple fact is we have to wait for proper hardware support, even if they choose to implement this optional spec.
 
Last edited:

SoulWager

Member
Jan 23, 2013
155
0
71
You are right, but maybe there's a timing issue.

Things like G-Sync/A-Sync require extremely precise control over the DP data channel. Maybe older hardware lacks the fine-grained control needed because of hardware simplifications.
I'm not saying it would be "free" for Nvidia to support adaptive-sync, because they'd have to do a lot of small driver tweaks to get it to work optimally.

A few examples:

Use triple buffering while scanning out a repeated frame, but double buffering when not repeating a frame.

Repeat a frame early if the last one was close to the threshold for a forced refresh, so that you are more likely to refresh the screen the instant the new frame is done rendering.

For a bit of extra smoothness(at the cost of a bit of latency), figure out the animation interval for a frame, and use that to inform the frame pacing algorithm, so a frame that covers 12ms of animation followed by a frame that takes 10ms to render would be shown on screen for 12ms instead of 10ms. (maybe slightly less, to keep latency from ballooning, but you get the idea).


This stuff is what a lot of you are assuming AMD gets right when you say FreeSync will beat G-Sync. The bottom line is that we have to see it working before we can make that call.
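The third example above can be sketched in a few lines. This is a hypothetical pacing function, not AMD's or Nvidia's actual algorithm, and the latency bound mentioned in the example is left out for brevity:

```python
# Sketch of animation-interval-informed frame pacing: hold each frame on
# screen for the span of animation time it covers, rather than only for as
# long as the next frame takes to render.

def paced_display_durations(animation_intervals_ms, render_times_ms):
    """For each frame, display it for the animation interval it represents,
    but never shorter than the render time of the frame that follows it
    (you can't swap to a frame that doesn't exist yet)."""
    durations = []
    n = len(render_times_ms)
    for i, anim in enumerate(animation_intervals_ms):
        next_render = render_times_ms[i + 1] if i + 1 < n else 0.0
        durations.append(max(anim, next_render))
    return durations

# A frame covering 12 ms of animation, followed by a frame that renders in
# 10 ms, is held for 12 ms instead of 10 ms:
print(paced_display_durations([12.0, 10.0], [10.0, 10.0]))  # [12.0, 10.0]
```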
 

SoulWager

Member
Jan 23, 2013
155
0
71
You haven't proved in any way that the hardware supports DP1.2a. Also, 1.2a is an optional spec. Even DP1.3 is nowhere to be found today: zero cards. And we only have two HDMI 2.0-capable cards after a year.

DP1.2a is DP1.2. There is no difference. Adaptive-sync has been an optional standard in DP1.2 since earlier this year, and if the eDP spec it's "based" on is any indication, all it covers is the configuration data for the monitor to tell the GPU it supports the feature.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
DP1.2a is DP1.2. There is no difference. Adaptive-sync has been an optional standard in DP1.2 since earlier this year, and if the eDP spec it's "based" on is any indication, all it covers is the configuration data for the monitor to tell the GPU it supports the feature.

You still haven't proved anything. You claim the hardware supports it, yet we need new cards and new monitors. There is no magic software/firmware fix (unless you own a small, select group of AMD cards).

It's simply an optional feature that isn't implemented in hardware on the desktop.
 

SoulWager

Member
Jan 23, 2013
155
0
71
You still haven't proved anything. You claim the hardware supports it, yet we need new cards and new monitors. There is no magic software/firmware fix (unless you own a small, select group of AMD cards).

It's simply an optional feature that isn't implemented in hardware on the desktop.
If Nvidia didn't have the hardware necessary for variable refresh, we wouldn't have g-sync.

You REALLY need to understand that DisplayPort is just data. You don't need a hardware change just to change a couple of bits of configuration data.
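To illustrate what "a couple bits of configuration data" means in practice, here's a hypothetical capability check. The bit position and register layout are made up for the example; they are NOT the real DPCD/EDID map:

```python
# A sink (monitor) advertising a feature to the source (GPU) through a single
# capability bit in a byte of configuration data. Purely illustrative.

ADAPTIVE_SYNC_BIT = 0x40  # assumed bit position, for illustration only

def sink_supports_adaptive_sync(capability_byte: int) -> bool:
    """Check one capability bit in a byte of (hypothetical) config data."""
    return bool(capability_byte & ADAPTIVE_SYNC_BIT)

print(sink_supports_adaptive_sync(0x40))  # True
print(sink_supports_adaptive_sync(0x00))  # False
```

The point being argued: reading and honoring a flag like this is a protocol and driver matter, not something that inherently requires new silicon.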
 
Feb 19, 2009
10,457
10
76
If Nvidia didn't have the hardware necessary for variable refresh, we wouldn't have g-sync.

According to NV, they don't. That's why they had to design and create a dedicated hardware module for G-Sync.

But you seem to know more than NV, go on, tell them your wondrous advice of doing it via a "driver tweak".
 

SoulWager

Member
Jan 23, 2013
155
0
71
According to NV, they don't. That's why they had to design and create a dedicated hardware module for G-Sync.

But you seem to know more than NV, go on, tell them your wondrous advice of doing it via a "driver tweak".

It's the monitors that didn't support variable refresh, hence the new hardware in the display, and the release date for the first announced FreeSync monitors being in March, about a year and a half after the initial G-Sync announcement.

The thing about variable refresh is it's not particularly difficult to implement, but it has to be implemented at every single step of the chain between the game engine and the panel.

Nvidia said they weren't planning on supporting adaptive-sync, but they never said it was for technical reasons. The business reasons are a lot more compelling.
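The every-step-of-the-chain argument can be stated as a trivially checkable model. The stage names here are illustrative, not an exhaustive or official list:

```python
# End-to-end variable refresh only works if every stage between the game
# engine and the panel supports it; one legacy stage breaks the chain.

CHAIN = ["game engine", "driver", "GPU scanout", "DisplayPort link",
         "monitor scaler", "panel"]

def supports_vrr(stage_support: dict) -> bool:
    """Variable refresh works end-to-end only if all stages support it."""
    return all(stage_support.get(stage, False) for stage in CHAIN)

support = {stage: True for stage in CHAIN}
print(supports_vrr(support))           # True
support["monitor scaler"] = False      # e.g. a fixed-refresh scaler
print(supports_vrr(support))           # False
```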
 
Last edited:

dacostafilipe

Senior member
Oct 10, 2013
772
244
116
It's the monitors that didn't support variable refresh, hence the new hardware in the panel, and the release date for the first announced FreeSync monitors being in March, about a year and a half after the initial G-Sync announcement.

The thing about variable refresh is it's not particularly difficult to implement, but it has to be implemented at every single step of the chain between the game engine and the panel.

Nvidia said they weren't planning on supporting adaptive-sync, but they never said it was for technical reasons. The business reasons are a lot more compelling.

Exactly!

I've tried to explain this over and over again here, but some people just don't want to understand.

Nice to have somebody that shares the same idea. :thumbsup:
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
It's the monitors that didn't support variable refresh, hence new hardware in the display, and the release date for the first announced freesync monitors being in march, about a year and a half after the initial g-sync announcement.

The thing about variable refresh is it's not particularly difficult to implement, but it has to be implemented at every single step of the chain between the game engine and the panel.

Nvidia said they weren't planning on supporting adaptive-sync, but they never said it was for technical reasons. The business reasons are a lot more compelling.

So why don't all AMD cards support it? I mean, according to you it's so easy and just needs a driver, because you've already decided that's how nVidia limited it. And don't tell me it's a hardware reason for the AMD cards that don't support it, while you claim nVidia cards can without any kind of documentation.
 

SoulWager

Member
Jan 23, 2013
155
0
71
So why don't all AMD cards support it? I mean, according to you it's so easy and just needs a driver, because you've already decided that's how nVidia limited it. And don't tell me it's a hardware reason for the AMD cards that don't support it, while you claim nVidia cards can.
Those AMD GPUs don't support variable refresh; you'd have to ask AMD whether it's a hardware reason or they just don't care about customers with older hardware. Those older cards do, however, support Adaptive-Sync, they just can't do much with it (they can use Adaptive-Sync to play content at whatever fixed refresh rate it was recorded at).

G-sync compatible cards DO support variable refresh. All of them. They just don't support it via adaptive-sync. That could change if Nvidia decides it makes business sense to support it. They'd be trading g-sync monitor sales to people with nvidia GPUs for GPU sales to people that already have adaptive-sync monitors, and this only makes business sense if there are a lot of people with adaptive sync monitors looking for new GPUs.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Those AMD GPUs don't support variable refresh; you'd have to ask AMD whether it's a hardware reason or they just don't care about customers with older hardware. Those older cards do, however, support Adaptive-Sync, they just can't do much with it (they can use Adaptive-Sync to play content at whatever fixed refresh rate it was recorded at).

G-sync compatible cards DO support variable refresh. All of them. They just don't support it via adaptive-sync. That could change if

Yet you claim nVidia can just do it with a driver. You still haven't proven anything. And looking at AMD, it's quite obvious that it requires more than just a driver.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
How can you say that without documentation? (see what I did here?)

If it wasn't, FreeSync would work on G-Sync monitors.

I don't get where you and SoulWager got the illusion that it's just some driver tweak that's needed. Especially when AMD themselves only support it on a limited set of GPUs.

AMD uses VBLANK directly, for example.

While nVidia uses the G-Sync module to modify the VBLANK.
 
Last edited: