[AMD] World's First Shipping FreeSync-Enabled Displays (CES)


bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Yea, and they sucked, so what's the point? Not to mention you're a slave to Nvidia if you plan on taking advantage of the monitor; good for the Nvidia uber alles crowd, I guess. But the smart consumer waits for better choices, especially considering a monitor is something you're going to be using through many GPU upgrades.

Until things change, if they do at all, if you get a Freesync monitor, you are a slave to AMD.

I'm personally just waiting to see which direction I'll go. AMD does look to have some promising options coming. It is wise to take a wait-and-see approach. I'm due for a GPU upgrade as well, which is going to make the next GPU generation part of my decision.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
626
126
Until things change, if they do at all, if you get a Freesync monitor, you are a slave to AMD.
This isn't the doing of AMD. Nvidia not supporting something that is going to be standard on most if not all monitors going forward would be an epic failing on the part of Nvidia.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
This isn't the doing of AMD. Nvidia not supporting something that is going to be standard on most if not all monitors going forward would be an epic failing on the part of Nvidia.

It is their choice, like it or not. There are only 2 high-end GPU choices, so if they don't support it, it leaves everyone in the same boat regardless of the brand you use. It'll likely take a noticeable loss in sales for them to give up on their own G-Sync modules.
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
AMD was much faster than Nvidia, since Tom from Nvidia said in an interview that when they announced G-Sync they had already been in development for 2 years.
Who cares who is the champion of rushed development?
I want to buy a product.

So in 1 year AMD is doing what took Nvidia 3 years.
As far as I have seen, most G-Sync monitors also support ULMB.
And none of the FreeSync monitors support ULMB or similar functionality.
When I spend money on a gaming monitor, I don't just want G-Sync/FreeSync, I also want ULMB. I don't see AMD offering that any time soon.

(Only G-Sync monitors with rather high resolutions don't support ULMB, because you can't do 120Hz+ refresh at those resolutions over current cables.)
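To put rough numbers on that cable limit: DisplayPort 1.2 (HBR2) carries about 17.28 Gbit/s of payload after 8b/10b coding, and a mode only works if pixels plus blanking fit inside that. A back-of-the-envelope sketch in Python (the blanking figures are illustrative reduced-blanking-style assumptions, not exact CVT numbers):

```python
# Rough check: does a video mode fit in DisplayPort 1.2's payload bandwidth?
# HBR2 = 4 lanes x 5.4 Gbit/s, minus 8b/10b coding overhead -> 17.28 Gbit/s.
DP12_PAYLOAD_GBPS = 4 * 5.4 * 8 / 10

def mode_gbps(h, v, hz, bpp=24, h_blank=80, v_blank=60):
    """Approximate bit rate for a mode; h_blank/v_blank are illustrative
    reduced-blanking-style assumptions, not exact CVT-R2 values."""
    return (h + h_blank) * (v + v_blank) * hz * bpp / 1e9

for h, v, hz in [(2560, 1440, 144), (3840, 2160, 60), (3840, 2160, 120)]:
    need = mode_gbps(h, v, hz)
    verdict = "fits" if need <= DP12_PAYLOAD_GBPS else "does NOT fit"
    print(f"{h}x{v} @ {hz}Hz needs ~{need:.1f} Gbit/s -> {verdict}")
```

2560x1440 @ 144Hz squeezes in at roughly 13.7 Gbit/s, but 4K @ 120Hz needs roughly 25 Gbit/s, which is why the highest-resolution G-Sync panels top out before strobing makes sense.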
 

geoxile

Senior member
Sep 23, 2014
327
25
91
Who cares who is the champion of rushed development?
I want to buy a product.


As far as I have seen, most G-Sync monitors also support ULMB.
And none of the FreeSync monitors support ULMB or similar functionality.

When I spend money on a gaming monitor, I don't just want G-Sync/FreeSync, I also want ULMB. I don't see AMD offering that any time soon.

(Only G-Sync monitors with rather high resolutions don't support ULMB, because you can't do 120Hz+ refresh at those resolutions over current cables.)

Ignorance is not an excuse to make ridiculous claims.

http://www.displaylag.com/ces-2015-benq-unveils-new-xl-rl-design-monitors/
The main attraction for BenQ’s new gaming monitors is the XL2730Z. This monitor supports a maximum refresh rate of 144hz, and features the bells and whistles you’d come to expect from a serious gaming monitor: 1ms GTG response time, TN panel, and low input latency. The main selling point of this monitor is the support of Adaptive-Sync technology, which is AMD’s answer to Nvidia’s G-SYNC module. For those unaware, G-SYNC and FreeSync are an absolute must for PC gaming, replacing vertical sync to provide stutter-free gaming experiences, without the drawbacks. The XL2730Z features a WQHD 2560 x 1440 resolution, which is quickly replacing 1080p as the standard resolution in higher-end monitors. It features the redesigned S. Switch that was first implemented in the XL2430T: a circular device that sits on the base of the monitor’s stand to allow faster control of the OSD. A host of inputs are available too, including dual HDMI inputs, D-Sub, DVI, and DisplayPort. It’ll be releasing quite soon too, expected to hit retail around late January-early February.
http://www.benq.us/news/2014/BenQ-Announces-New-Product-Lineup-for-the-2015-International-CES
BenQ’s 27-inch XL2730Z monitor ensures gamers stay on the leaderboard by providing a WQHD 2560 x 1440 gateway into the fast-action world of gaming. Equipped with Gaming Refresh-rate Optimization Management (GROM), gamers gain the freedom to custom-build their personal gaming experience by tweaking viewing preferences such as refresh rates, display resolutions, and screen sizes. For even greater gaming comfort, the XL2730Z features BenQ’s RevolutionEyes™ technology for exceptional monitor performance by eliminating backlight flickering at all brightness levels so that gamers can engage in longer playing sessions. In addition, the monitor’s low blue light technology manages the exposure of blue spectrum light emitted by computer screens to further contribute to more comfortable viewing. The XL2730Z also comes equipped with 144Hz refresh rate, 1ms GTG response time, adaptive sync function, as well as BenQ’s Black Equalizer, Motion Blur Reduction 2.0, Auto Game, and Game to Go Modes.
http://www.blurbusters.com/benq-xl2720z-another-official-motion-blur-eliminating-strobe-backlight/

Hot on the heels of G-SYNC's official strobe mode and EIZO's Turbo240 official strobe mode, BenQ announces the XL2720Z (Z-suffix) with the Motion Blur Reduction feature, which Blur Busters confirms is another high-efficiency LightBoost-style strobe backlight, official and easily enabled in the monitor menus.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
This isn't the doing of AMD. Nvidia not supporting something that is going to be standard on most if not all monitors going forward would be an epic failing on the part of Nvidia.

If I purchase a Freesync monitor, how many GPU brands work with this monitor?
 

tential

Diamond Member
May 13, 2008
7,348
642
121
You can use it as a monitor with any brand but FreeSync will only work with AMD GPUs and you will be locked in.

So why do I still keep hearing that FreeSync doesn't lock me into a brand if it only works with AMD GPUs?
 
Feb 19, 2009
10,457
10
76
So why do I still keep hearing that FreeSync doesn't lock me into a brand if it only works with AMD GPUs?

Technically, waffleironhead is correct.

It will work on all GPUs. The monitor itself works just fine like any other, except you won't get FreeSync unless you have a supported AMD GPU.

Now, if the cost is the same as non-FS models, you lose nothing.
 

Pantalaimon

Senior member
Feb 6, 2006
341
40
91
So why do I still keep hearing that FreeSync doesn't lock me into a brand if it only works with AMD GPUs?

Which of these statements below are true?

- VESA's Adaptive Sync is an optional feature of the DisplayPort standard and an industry standard

- Freesync is AMD's way to utilize VESA's Adaptive Sync.

- Since it's an industry standard, Intel and NVIDIA are free to come up with their own way to utilize Adaptive Sync

- G-Sync on the other hand is NVIDIA's proprietary technology

- If you buy a G-Sync monitor you can use the G-Sync feature in it only with an NVIDIA GPU
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Technically, waffleironhead is correct.

It will work on all GPUs. The monitor itself works just fine like any other, except you won't get FreeSync unless you have a supported AMD GPU.

Now, if the cost is the same as non-FS models, you lose nothing.

Gsync monitors work with AMD GPUs, except without the GSync feature.

I'm failing to see your point other than "price," which I've made no comments about, so let's stop coming back to "price, price, price" and instead address the question I'm actually asking:

How does the FreeSync feature work with GPUs other than AMD GPUs?
 

Pantalaimon

Senior member
Feb 6, 2006
341
40
91
Gsync monitors work with AMD GPUs, except without the GSync feature.

Except there's no chance you can use it with anything other than an NVIDIA GPU.

If you buy an Adaptive Sync monitor, there's a chance you can use it with an Intel or NVIDIA GPU, should either one choose to adopt it.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Except there's no chance you can use it with anything other than an NVIDIA GPU.

If you buy an Adaptive Sync monitor, there's a chance you can use it with an Intel or NVIDIA GPU, should either one choose to adopt it.

There's a chance? Based on what, your crystal ball? :rolleyes: Stop posting misleading statements. Nobody besides AMD uses adaptive sync/free sync and those are the facts.
 
Feb 19, 2009
10,457
10
76
There's a chance? Based on what, your crystal ball? :rolleyes: Stop posting misleading statements. Nobody besides AMD uses adaptive sync/free sync and those are the facts.

I thought the technology has been around a while in notebooks & tablets. Variable refresh rate, that is, the purpose of which is to save power, since the screen only updates when it needs to and doesn't have to run at a full 60 Hz constantly.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
I thought the technology has been around a while in notebooks & tablets. Variable refresh rate, that is, the purpose of which is to save power, since the screen only updates when it needs to and doesn't have to run at a full 60 Hz constantly.

Variable refresh rate for power saving is hardly the same thing as adaptive sync where the GPU controls the refresh rate in 3D applications. But you already knew that and are trying to act clever.
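The difference is easy to see in a toy timing model: with fixed-rate vsync, a frame that misses the refresh tick waits for the next one (a 16.7 ms interval becomes a 33.3 ms hiccup), while with adaptive sync the panel refreshes when the frame is actually ready, within its supported range. A minimal sketch, with made-up render times and a hypothetical 40-75 Hz panel range:

```python
import math

render_ms = [14, 15, 21, 17, 16]  # hypothetical per-frame GPU render times

def fixed_vsync(times, refresh_hz=60):
    """Each frame appears at the next free refresh tick after it finishes."""
    tick = 1000 / refresh_hz
    t, last_tick, shown = 0.0, 0, []
    for r in times:
        t += r
        k = max(math.ceil(t / tick), last_tick + 1)  # wait for a free tick
        shown.append(k * tick)
        last_tick = k
    return shown

def adaptive_sync(times, min_hz=40, max_hz=75):
    """Panel refreshes when the frame is ready, clamped to its rate range.
    (Below min_hz a real panel would just re-show the previous frame.)"""
    lo, hi = 1000 / max_hz, 1000 / min_hz
    t, shown = 0.0, []
    for r in times:
        t += r
        last = shown[-1] if shown else 0.0
        shown.append(last + min(max(t - last, lo), hi))
    return shown

print("fixed 60 Hz:", [round(x, 1) for x in fixed_vsync(render_ms)])
print("adaptive   :", [round(x, 1) for x in adaptive_sync(render_ms)])
```

In the fixed-rate run, one frame just misses its tick and sits on screen twice as long (a 33.3 ms gap); the adaptive run tracks the render cadence directly.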
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
If Intel decides to adopt G-Sync, will they need to pay royalties to Nvidia, since the module is manufactured by Nvidia?
 

Pantalaimon

Senior member
Feb 6, 2006
341
40
91
There's a chance? Based on what, your crystal ball? :rolleyes: Stop posting misleading statements. Nobody besides AMD uses adaptive sync/free sync and those are the facts.

I suppose you have a crystal ball, since you know NVIDIA will not adopt Adaptive Sync?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
This isn't the doing of AMD. Nvidia not supporting something that is going to be standard on most if not all monitors going forward would be an epic failing on the part of Nvidia.

What the pro-GSync supporters still can't get through their heads is that because FreeSync is an industry open standard that any monitor controller maker can use, as long as the graphics card, the monitor and the software/firmware all meet the minimum required spec for FreeSync, it will work on whichever vendor's GPU and monitor you choose. Let that sink in for a second.

Since it is cheaper to implement FreeSync than a GSync module, FreeSync resolves the module overheating issue found in some ROG monitors, gives you manual control over VSync above a certain refresh rate unlike GSync, can readily work in laptops since no cumbersome module is required, and is mono-directional, thus getting you lower latency, there is no logical reason at all to cheer for GSync unless reviewers prove that GSync is scientifically superior. Based on observation of nearly every review site which has reported on FreeSync, not one reputable reporter has said that GSync appears to be superior in person.

By supporting GSync we are actually delaying market adoption, introducing higher costs, and limiting our monitor choices, since vendors have been slow to adopt this closed proprietary standard. All NV needs to do is support FreeSync, and if they truly think GSync is better, support that too! Why is it so difficult for some to grasp this simple concept: if a GPU supports FreeSync and GSync, you have options, but supporting a vendor-locked standard right off the bat (GSync) only hurts the adoption of an industry open standard and limits choices.

Those who have NV GPUs and already purchased a GSync monitor can keep enjoying it. No one is forcing you to buy an AMD card or a FreeSync-capable monitor. For the rest of us, it would be a godsend to choose from various brands, various types of TN, IPS and VA panels, and 60, 120, 144Hz refresh rates, all of which is what will happen because open standards force more competition.

Not only that, but the Asus 1440p 120Hz IPS A-Sync monitor at $599 already proves that FreeSync is cheaper. The only 2 kinds of people I can think of who would continue to viciously oppose FreeSync becoming an Intel/AMD/NV open standard are NV shareholders or blind NV loyalists. Think about the 60%+ of GPU users in the world using Intel, some of whom could play WoW, LoL, SC2, Dota 2, etc. and have adaptive sync courtesy of the industry-standard FreeSync. This simply cannot be achieved with GSync, which essentially segregates more than 80% of all GPUs by excluding AMD and Intel GPUs.

Given the slow trickle of GSync monitors, and most of those having unimpressive screens, it is obvious that the majority of the market is either not impressed by the overall quality of available GSync monitors, thinks the pricing premium is too high, or is making a conscious decision not to support vendor lock and is waiting for the market to decide.

This strawman argument that only AMD cards support FreeSync now, so they are also vendor-locked, is ONLY true because some people keep buying GSync, thus effectively promoting the closed proprietary standard. If no one bought GSync, then NV would have no choice but to adopt FreeSync. Of course Intel would also adopt it, since they can't use GSync. We would end up with either FreeSync or with FreeSync + GSync options, in both cases allowing any GPU to benefit from any adaptive sync monitor --- that's 100% better than market segregation forcing one to choose which GPU to pair with which monitor -- that is, unless one is an NV employee, NV shareholder, or an NV loyalist who hates all things AMD.
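For what it's worth, the "minimum required spec" plumbing is not exotic. An Adaptive-Sync monitor advertises its supported refresh range in its EDID via the standard Monitor Range Limits descriptor (tag 0xFD), which any vendor's driver can read. A minimal illustrative sketch of pulling the vertical rate range out of a raw 128-byte EDID base block (a simplification: real drivers also handle the range-offset flags and DisplayID extensions):

```python
def vertical_refresh_range(edid: bytes):
    """Return (min_hz, max_hz) from the EDID Monitor Range Limits
    descriptor (tag 0xFD), or None if the block doesn't carry one."""
    assert len(edid) >= 128 and edid[0:2] == b"\x00\xff", "not an EDID block"
    for off in (54, 72, 90, 108):          # the four 18-byte descriptors
        d = edid[off:off + 18]
        # Display descriptors (unlike detailed timings) begin with 0x0000.
        if d[0] == 0 and d[1] == 0 and d[3] == 0xFD:
            return d[5], d[6]              # min/max vertical rate in Hz
    return None

# A driver could then enable variable refresh whenever min_hz < max_hz and
# the game's frame rate falls inside that window.
```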
 
Last edited:

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
What the pro-GSync supporters still can't get through their heads is that because FreeSync is an industry open standard that any monitor controller maker can use, as long as the graphics card, the monitor and the software all meet the minimum required spec for FreeSync, it will work on whichever vendor's GPU and monitor you choose. Let that sink in for a second.

Since it is cheaper to implement FreeSync than a GSync module, FreeSync resolves the module overheating issue found in some ROG monitors, gives you manual control over VSync above a certain refresh rate unlike GSync, can readily work in laptops since no cumbersome module is required, and is mono-directional, thus getting you lower latency, there is no logical reason at all to cheer for GSync unless reviewers prove that GSync is scientifically superior. Based on observation of nearly every review site which has reported on FreeSync, not one reputable reporter has said that GSync appears to be superior.

By supporting GSync we are actually delaying market adoption, introducing higher costs, and limiting our monitor choices, since vendors have been slow to adopt this closed proprietary standard. All NV needs to do is support FreeSync, and if they truly think GSync is better, support that too! Why is it so difficult for some to grasp this simple concept: if a GPU supports FreeSync and GSync, you have options, but supporting a vendor-locked standard right off the bat (GSync) only hurts the adoption of an industry open standard and limits choices.

Those who have NV GPUs and already purchased a GSync monitor can keep enjoying it. No one is forcing you to buy an AMD card or a FreeSync-capable monitor. For the rest of us, it would be a godsend to choose from various brands, various types of TN, IPS and VA panels, and 60, 120, 144Hz refresh rates, all of which is what will happen because open standards force more competition.

Not only that, but the Asus 1440p 120Hz IPS A-Sync monitor at $599 already proves that FreeSync is cheaper. The only 2 kinds of people I can think of who would continue to viciously oppose FreeSync becoming an Intel/AMD/NV open standard are NV shareholders or blind NV loyalists. Think about the 60%+ of GPU users in the world using Intel, some of whom could play WoW, LoL, SC2, Dota 2, etc. and have adaptive sync courtesy of the industry-standard FreeSync. This simply cannot be achieved with GSync, which essentially segregates more than 80% of all GPUs by excluding AMD and Intel GPUs.

Given the slow trickle of GSync monitors, and most of those having unimpressive screens, it is obvious that the majority of the market is either not impressed by the overall quality of available GSync monitors, thinks the pricing premium is too high, or is making a conscious decision not to support vendor lock and is waiting for the market to decide.

What overheating G-Sync modules? Got a link + evidence for that? And the 120 Hz IPS monitor is a mainstream display; it doesn't prove anything since it lacks the ROG moniker. And keep dreaming of Intel support and of those random office workers playing on integrated graphics suddenly going out and buying adaptive sync displays - I got a bridge to sell you if you really think those type of people would be in the market for an adaptive sync gaming monitor. Even if it's an open standard, if the bill of materials is higher for manufacturers, your random Intel integrated gfx guy won't spend extra $$ to buy it. If he cared about that sorta thing, he'd already own a gaming GPU.

And once again, let's say your fantasy comes true: why would NVIDIA suddenly choose to support it? It doesn't hurt them at all to keep pushing G-Sync as long as they own the gaming discrete market. Now, if tomorrow all of a sudden the entire add-in board customer base decided to throw their weight behind adaptive sync and stop buying NVIDIA graphics cards, then maybe they'd do it. But realistically speaking, what are the chances of that? Probably as good as AMD staying out of the red.
 
Last edited:

gorobei

Diamond Member
Jan 7, 2007
3,964
1,448
136
There's a chance? Based on what, your crystal ball? :rolleyes: Stop posting misleading statements. Nobody besides AMD uses adaptive sync/free sync and those are the facts.

I thought the technology has been around a while in notebooks & tablets. Variable refresh rate, that is, the purpose of which is to save power, since the screen only updates when it needs to and doesn't have to run at a full 60 Hz constantly.

Variable refresh rate for power saving is hardly the same thing as adaptive sync where the GPU controls the refresh rate in 3D applications. But you already knew that and are trying to act clever.

PSR was proposed a while ago as part of eDP and DP 1.3.
http://www.hardwaresecrets.com/article/introducing-the-panel-self-refresh-technology/1384
http://liliputing.com/2012/04/intel-future-could-use-less-power-panel-self-refresh-tech.html
Intel was the main promoter back in the early ultrabook days, using it as a bullet point to highlight how low-energy they were going to be. So it's been coming for a while, but I'm not sure how widely it's implemented in laptops/notebooks.
http://www.anandtech.com/show/7208/understanding-panel-self-refresh
Anand quotes LG at a 26% power savings for PSR in SoC applications. Monitors don't use that much less energy, but the GPU and the memory buses can see some benefit.
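The mechanism is simple enough to caricature in a few lines: with PSR the panel holds its own copy of the last frame, so the link and the GPU's memory reads can idle whenever nothing on screen changed. A toy model (the class, names, and the 60-refresh loop are purely illustrative):

```python
import hashlib

class PsrPanel:
    """Toy Panel Self-Refresh model: the panel caches the last frame and
    the GPU only drives the link when the content actually changes."""
    def __init__(self):
        self.cached = None          # fingerprint of the frame the panel holds
        self.link_transfers = 0     # times the GPU really sent pixels

    def refresh(self, framebuffer: bytes):
        fp = hashlib.sha1(framebuffer).digest()
        if fp != self.cached:       # content changed: transmit the frame
            self.link_transfers += 1
            self.cached = fp
        # else: link idles; panel self-refreshes from its cached copy

panel = PsrPanel()
frames = [b"static desktop"] * 57 + [b"cursor moved"] * 3  # one second @ 60 Hz
for f in frames:
    panel.refresh(f)
print(f"60 panel refreshes, only {panel.link_transfers} link transfers")
```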

The Adaptive-Sync addendum to DP 1.2a and 1.3 means PSR is coming to desktops. AMD will use it for games and video media; Intel will likely use it for PSR so they can sell corporate customers on upgrading CPUs/APUs to reduce power costs.

Either way, monitors with VESA Adaptive-Sync will be de facto standard, as the scaler chip makers are going with it.

If every new monitor going forward has Adaptive-Sync functionality for variable refresh rate, there is very little business sense in Nvidia competing with established scaler chip makers.
Competing in a new market where everyone else already has established customers, supply chains, IP, and experienced personnel is not something a board of directors is going to approve just to serve a niche market of gamer monitor buyers. The seeming $200 premium for the G-Sync FPGA just can't compete with the established chip makers, and without massive changes in sales/buying it doesn't justify spending millions on an ASIC version. I have yet to see a single article on Nvidia going forward with the ASIC.
 