[Fudzilla] BenQ and Viewsonic F[r]eeSync monitors in time for holidays

Page 4 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
You are missing the point. If Nvidia chooses not to support A-Sync, then they effectively make A-Sync an AMD-only option, unless you plan to buy an Intel discrete GPU.

It doesn't matter if A-sync's tech is open, it has to be supported by Nvidia to matter.

Intel doesn't support A-Sync either. And the ball seems to be in Intel's court.

Intel single-handedly decides the fate of A-Sync due to market share. If they don't support it, A-Sync will be another G-Sync market-wise. If they do support it, Nvidia will be forced to follow as well.

But there is no way Intel will support A-Sync before Cannonlake, even assuming the decision to support it has already been made.
 

dacostafilipe

Senior member
Oct 10, 2013
805
309
136
To me it matters a great deal that FreeSync is an open standard and GSync is closed/proprietary.

Please don't mix things up. FreeSync is closed/proprietary just like G-Sync.

So, AMD's FreeSync is locked to AMD hardware.

The open part is VESA's Adaptive-Sync, and it still requires software support to be usable. We will no doubt see devs create software for controlling A-Sync monitors, but it will take some time.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
No, I think the primary reason why G-Sync has failed to take off is because Average Joe won't see its value. They think that the tearing, stuttering mess they have with V-Sync off is perfectly fine (this might be an enthusiast forum, but how many times have you seen posts like "I don't have any tearing because I use a 120fps monitor", etc.?). In this state of blissful ignorance, G-Sync doesn't seem all that compelling. I'm sure that most of those people would still realize what a game changer G-Sync is if they actually saw it in the flesh, but that's the main obstacle; without actually seeing it, it's difficult for the layman to understand. You obviously can't demonstrate it accurately in videos. In fact, attempts to do this have likely backfired and turned away some potential G-Sync adopters (for example, unbeknownst to Tom Petersen, the 'smooth motion' G-Sync simulation broadcast on the PCPer G-Sync introduction stream was full of stutter from the stream itself, and I can just imagine the people sitting at home watching it thinking "what's the difference?").

You don't think the ridiculous cost of it has something to do with it?
Average Joe, who spends <$300 on their GPU, is not going to be interested in an $800+ TN monitor no matter how compelling the tech inside it might be. Even on my crappy Twitch stream, I could see what G-Sync was doing, especially with their pendulum demo. Anyone who has heard about G-Sync has some appreciation and understanding of what it's about.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
No, I think the primary reason why G-Sync has failed to take off is because Average Joe won't see its value. They think that the tearing, stuttering mess they have with V-Sync off is perfectly fine (this might be an enthusiast forum, but how many times have you seen posts like "I don't have any tearing because I use a 120fps monitor", etc.?). In this state of blissful ignorance, G-Sync doesn't seem all that compelling. I'm sure that most of those people would still realize what a game changer G-Sync is if they actually saw it in the flesh, but that's the main obstacle; without actually seeing it, it's difficult for the layman to understand. You obviously can't demonstrate it accurately in videos. In fact, attempts to do this have likely backfired and turned away some potential G-Sync adopters (for example, unbeknownst to Tom Petersen, the 'smooth motion' G-Sync simulation broadcast on the PCPer G-Sync introduction stream was full of stutter from the stream itself, and I can just imagine the people sitting at home watching it thinking "what's the difference?").

I think you are completely spot on. G-Sync's and A-Sync's biggest enemy is the lack of personal demonstration.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
Freesync was allegedly coming a year ago.

No it wasn't; AMD said that it would take 6-12 months back in May (source). So that would be November 2014 at the earliest (obviously not going to happen) and May 2015 at the latest.
 
Last edited:

Spanners

Senior member
Mar 16, 2014
325
1
0
Freesync was allegedly coming a year ago, and was released just when NV released G-Sync to muddy the waters.
I guess while NV users may pay more, at least they have enjoyed the tech for a good period already.
I personally think it's a bit out of order to claim a better product with no overhead when it still doesn't have a consumer monitor or a professional review.

To muddy the waters? What do you think this is? AMD obviously released the information they had at the time in an attempt to counter G-Sync; why wouldn't they? This is competition, after all. Did you expect them to say nothing in response until they had a fully working product in the hands of reviewers? They never lied and said "don't buy G-Sync, our product will be here tomorrow".

Also who was claiming it was a better product? It seems to have some cost advantages but we'll have to wait and see.

No it wasn't; AMD said that it would take 6-12 months back in May (source). So that would be November 2014 at the earliest (obviously not going to happen) and May 2015 at the latest.

Thanks, so it does seem to be on track for release within that time frame.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
TR just put up this article a few hours ago:

FreeSync displays have entered the mass production and validation stage, AMD tells us. The first commercial offerings are now scheduled for a launch in the January-February time frame. We'll still have to wait until March for Samsung's FreeSync-enabled UHD monitors, though.
I'm guessing we'll have at least some fresh details early next month, since AMD and its partners plan to have FreeSync monitors on display at CES.
http://techreport.com/news/27484/freesync-monitors-hit-mass-production-coming-in-jan-feb
 

Spanners

Senior member
Mar 16, 2014
325
1
0
Looks like the November rumor was wrong, and the "before holiday season" rumor was wrong too.

Let's hope the Q1 rumor isn't wrong...



AMD did. Let me see if I can find the slide...

http://i.imgur.com/zrLbb6H.jpg

Well, rumors will be rumors.

Besides the blindingly obvious fact that AMD would be trying to promote their own upcoming product, that slide doesn't state anything like "it's a better product". It's just some feature checks that they, again obviously, put together to show FreeSync in a positive light.
 

96Firebird

Diamond Member
Nov 8, 2010
5,742
340
126
Oh, so comparing your product to a competitor's product is not a way to try to prove it's better?

Am I missing something?

Don't get me wrong, I have no problem with AMD trying to promote their product. Just showing you where SolMiester is coming from.
 
Last edited:

Spanners

Senior member
Mar 16, 2014
325
1
0
Oh, so comparing your product to a competitor's product is not a way to try to prove it's better?

Am I missing something?

Don't get me wrong, I have no problem with AMD trying to promote their product. Just showing you where SolMiester is coming from.

I get what you're saying, but I still don't understand what you and Sol expect. Do you think AMD would come out with an objective assessment of the strengths and weaknesses of FreeSync vs. G-Sync? That's the reviewers' job. I'd assumed he meant someone here had said it was a better product.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Also who was claiming it was a better product? It seems to have some cost advantages but we'll have to wait and see.

The blatant lie was: Gsync introduces lag, FreeSync eliminates lag.

And it came from AMD Gaming Scientist Richard Huddy.

But the guy is obviously out of control, so I'm not sure we should hold AMD responsible for his ramblings.
Although AMD did have a similarly ridiculous comparison (like the one vs. G-Sync) with Raptr vs. GeForce Experience.

But hey, at least they put the disclaimer at the end:
Peter Ross is a Senior Product Marketing Manager at AMD. His postings are his own opinions and may not represent AMD's positions, strategies or opinions.

So comparing that Facebook/WhatsApp social networking pile Raptr vs. the genuinely useful and unobtrusive GFE/Shadowplay is him alone, not AMD.
The guy is only a Senior Product Marketing Manager. ;)
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Because FreeSync is part of the industry standard port and interface. If Nvidia wants to be 100% compliant with that standard, they will support it. It just might take them a generation to realize that GSync is dead. Nvidia is a very arrogant company, they don't like to eat crow.

Look at HD3D and 3D Vision. HD3D is an open standard that AMD came up with a little bit after 3D Vision was made. Nvidia never supported HD3D.

If you buy an HD3D monitor, you have to buy AMD GPUs to use it. If you buy a 3D Vision monitor, you have to buy Nvidia to use the 3D features.

It has nothing to do with arrogance. It's simply about what they feel is the best way to make money. If they think competing against AMD can make them more money, they may choose to do so. If they feel they'll make more money supporting the open standard, they will.

Just because a standard exists does not mean a vendor has to comply with it. Holding out may be tough if there are lots of players, but when there is only one other (for discrete cards), it is up to them.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
The blatant lie was: Gsync introduces lag, FreeSync eliminates lag.

Technically that isn't a lie at all. G-Sync does introduce lag (1 ms, to be exact) due to polling of the display. FreeSync apparently doesn't suffer from this (and Nvidia is also working on eliminating it).

The thing you're thinking of is probably Huddy alleging that G-Sync utilized triple buffering (which would potentially entail an entire frame's worth of lag).
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Technically that isn't a lie at all. G-Sync does introduce lag (1 ms, to be exact) due to polling of the display. FreeSync apparently doesn't suffer from this (and Nvidia is also working on eliminating it).

The thing you're thinking of is probably Huddy alleging that G-Sync utilized triple buffering (which would potentially entail an entire frame's worth of lag).

Triple buffering doesn't itself cause lag, though the situations that make it necessary often do. And it was AMD who initially thought they'd need it, not Nvidia.

Anyway, the only frame's worth of lag that G-Sync gets hit with is when you reach about 140 FPS (a little below the refresh-rate cap), which is the same deal as V-Sync when you reach the refresh-rate cap.
 
Last edited:

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
Triple buffering doesn't cause lag, though if it was required, it does make it sound like it could, but it was AMD who initially thought they'd need it, not Nvidia.

Anyways, the only frame worth of lag that G-sync gets hit with, is when you reach about 140 FPS, which is the same deal with V-sync when you reach the refresh rate cap.

That depends on the exact implementation of triple buffering: a triple-buffered render-ahead queue will generally introduce one frame's worth of lag (although, to be fair, a render-ahead queue is not really "true" triple buffering as the term is most commonly used).
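To illustrate the distinction, here's a toy sketch in Python (my own illustration, not anything from either vendor's driver; real buffer management happens in the driver/display pipeline). A render-ahead queue serves frames oldest-first, so extra queued frames become latency, while "true" triple buffering always scans out the newest completed frame and drops the rest:

```python
from collections import deque

# Each inner list holds the frames the GPU finished between two scan-outs
# (GPU rendering faster than the display refreshes).
batches = [[0, 1], [2, 3], [4, 5]]

def render_ahead_fifo(batches):
    """Render-ahead queue: scan-out takes the OLDEST queued frame (FIFO),
    so frames waiting in the queue show up as added latency."""
    q, shown = deque(), []
    for batch in batches:
        q.extend(batch)
        shown.append(q.popleft())   # display the oldest completed frame
    return shown

def true_triple_buffer(batches):
    """'True' triple buffering: scan-out takes the NEWEST completed frame;
    older undisplayed frames are simply dropped, so no latency accumulates."""
    shown = []
    for batch in batches:
        shown.append(batch[-1])     # display the newest completed frame
    return shown

print(render_ahead_fifo(batches))   # [0, 1, 2] -> by the end, frame 5 exists but frame 2 is shown
print(true_triple_buffer(batches))  # [1, 3, 5] -> always the freshest frame
```

By the third scan-out the FIFO queue is showing frame 2 while frame 5 has already finished rendering, which is exactly the queued-up lag being described; the "true" scheme trades dropped frames for freshness instead.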
 

tential

Diamond Member
May 13, 2008
7,348
642
121
No, I think the primary reason why G-Sync has failed to take off is because Average Joe won't see it's value. They think that the tearing, stuttering mess they have with V-Sync off is perfectly fine (This might be an enthusiast forum, but how many times have you seen posts like "I don't have any tearing because I use a 120fps monitor", etc?). In this state of blissful ignorance, G-Sync doesn't seem all that compelling. I'm sure that most of those people would still realize what a game changer G-Sync is if they actually saw it in the flesh, but that's the main obstacle; without actually seeing it, it's difficult for the layman to understand. You obviously can't demonstrate it accurately in videos. In fact, attempts to do this have likely backfired, and turned away some potential G-Sync adopters (for example, unbeknownst to Tom Petersen, the 'smooth motion' G-Sync simulation broadcast on the PCPer G-Sync introduction stream was full of stutter from the stream itself, and I can just imagine the people sitting at home watching it thinking "what's the difference"?)

I wasn't going to pay an Nvidia premium combined with a G-Sync monitor premium.
That was ridiculous.

But on top of that, I'm not interested in FreeSync/G-Sync until it hits a large panel, 70 inches or more. I'm hoping someone will do it though. Maybe Seiki?

Anyone... Please... Don't forget about me :(
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
That depends on the exact implementation of triple buffering: a triple-buffered render-ahead queue will generally introduce one frame's worth of lag (although, to be fair, a render-ahead queue is not really "true" triple buffering as the term is most commonly used).

Like I said, triple buffering does not add latency, but the reasons to use it usually do.
 

_UP_

Member
Feb 17, 2013
144
11
81
I am waiting for FreeSync. Honestly, I think G-Sync is too expensive. I also don't like the thought of locking myself in; I switch cards often, and switch between Nvidia and AMD all the time. That said, for now, FreeSync is just as locked as G-Sync.
I think ShintaiDK was right on the money on both counts: only one thing will make FreeSync win, and that is Intel. If they choose to support it, as the largest GPU manufacturer (a bit old, but still true) they have the power to change things. Unfortunately, as ShintaiDK mentioned, Intel would take a while to do so; as a very large company, they don't adapt fast because they plan a long time in advance.
On that note, I think AMD played its cards wrong when it didn't let Intel in on Mantle. It is a double mistake in my book. They say they are all about open standards (and usually they deliver), and they say that Mantle is still in beta, but it has been a long time and they should have let Intel in on it. It would hurt their (already losing) CPU side but has the potential to help the GPU side in a big way.
I truly hope they won't make this mistake with FreeSync; that is, if it really is an open standard and not an implementation of A-Sync that is locked to AMD.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Unfortunately, as ShintaiDK mentioned, Intel would take a while to do so; as a very large company, they don't adapt fast because they plan a long time in advance.

It's not because they don't adapt fast; it's because the support sits on the CPU, and CPUs take a long time to design and validate. Cannonlake, for example, is already in the validation process and as such is a finished design.
 

_UP_

Member
Feb 17, 2013
144
11
81
It's not because they don't adapt fast; it's because the support sits on the CPU, and CPUs take a long time to design and validate. Cannonlake, for example, is already in the validation process and as such is a finished design.

That is very true. It is also the slow part, though. The bigger a company is, usually the slower it can change things; you have more bureaucracy and more managers to go through to get something approved. But you are very much right about the long process of CPU development and validation.
I do wonder, though, how much hardware is needed to implement A-Sync, and whether Intel needs to physically add something or can "unleash" it with a firmware update of some sort (now, or in a future but already-validated CPU such as Cannonlake). I have no idea about this and would assume it does require some sort of hardware component.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I am waiting for FreeSync. Honestly, I think G-Sync is too expensive. I also don't like the thought of locking myself in; I switch cards often, and switch between Nvidia and AMD all the time. That said, for now, FreeSync is just as locked as G-Sync.
I think ShintaiDK was right on the money on both counts: only one thing will make FreeSync win, and that is Intel. If they choose to support it, as the largest GPU manufacturer (a bit old, but still true) they have the power to change things. Unfortunately, as ShintaiDK mentioned, Intel would take a while to do so; as a very large company, they don't adapt fast because they plan a long time in advance.
On that note, I think AMD played its cards wrong when it didn't let Intel in on Mantle. It is a double mistake in my book. They say they are all about open standards (and usually they deliver), and they say that Mantle is still in beta, but it has been a long time and they should have let Intel in on it. It would hurt their (already losing) CPU side but has the potential to help the GPU side in a big way.
I truly hope they won't make this mistake with FreeSync; that is, if it really is an open standard and not an implementation of A-Sync that is locked to AMD.

How would allowing intel to help with mantle hurt AMD on the CPU side?

Mantle removes CPU overhead. No matter what, it will benefit the weaker processor more.
 

_UP_

Member
Feb 17, 2013
144
11
81
How would allowing intel to help with mantle hurt AMD on the CPU side?

Mantle removes CPU overhead. No matter what, it will benefit the weaker processor more.

It will still help any CPU, and Intel sells more of them. Right now (or soon), AMD will have Mantle to write home about and Intel won't. That, I'm sure they're hoping, will help tip some people toward buying AMD CPUs. If Intel could also have Mantle, that would reduce the advantage AMD gets from it, and as Intel CPUs are better and faster (not to mention the market share and brand recognition), it would give them an even bigger advantage over AMD CPUs.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
It would still only be a benefit for AMD. More Mantle = more useful AMD CPUs, plus more Mantle games with Intel as a backer.

So the issue may rather be that Mantle is more tied to GCN than we've been told.