[AMD] World's First Shipping FreeSync-Enabled Displays (CES)

Status
Not open for further replies.

beginner99

Diamond Member
Jun 2, 2009
5,318
1,763
136
BenQ seems to be the killer here


Exactly, I agree. If Blur Reduction 2.0 isn't as much of a failure as 1.0 was, then this is the display I will buy: blur reduction for old FPS titles and FreeSync for modern, demanding titles. 4K isn't really feasible for gaming right now IMHO; it costs too much in GPU power.
 

Galatian

Senior member
Dec 7, 2012
372
0
71
Oh man, I'm so torn. I was dead set on getting three 27" 4K monitors with IPS. The Dell ones seemed like a perfect deal, but then I was reading about the Samsung FreeSync offerings. Those turn out to be 28", which is too big for my desk/needs. And now along comes BenQ with a perfect gaming monitor... ugh, so hard to decide.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
I hope for more 1080p FreeSync stuff. I'm not going to up the resolution, at least until there is a huge perf/$ push in the GPU market.
I would rather run 1080p maxed out (with reasonable AA) than 1440p at medium.
 

jackstar7

Lifer
Jun 26, 2009
11,679
1,944
126
Looks like BenQ might get my business for a companion to my Overlord.

http://www.benq.us/news/2014/BenQ-Announces-New-Product-Lineup-for-the-2015-International-CES

BenQ said:
New Wide-Quad High-Definition XL2730Z

BenQ’s 27-inch XL2730Z monitor ensures gamers stay on the leaderboard by providing a WQHD 2560 x 1440 gateway into the fast-action world of gaming. Equipped with Gaming Refresh-rate Optimization Management (GROM), gamers gain the freedom to custom-build their personal gaming experience by tweaking viewing preferences such as refresh rates, display resolutions, and screen sizes. For even greater gaming comfort, the XL2730Z features BenQ’s RevolutionEyes™ technology for exceptional monitor performance by eliminating backlight flickering at all brightness levels so that gamers can engage in longer playing sessions. In addition, the monitor’s low blue light technology manages the exposure of blue spectrum light emitted by computer screens to further contribute to more comfortable viewing. The XL2730Z also comes equipped with 144Hz refresh rate, 1ms GTG response time, adaptive sync function, as well as BenQ’s Black Equalizer, Motion Blur Reduction 2.0, Auto Game, and Game to Go Modes.

An agnostic motion blur reducer is definitely welcome.

The only thing that would sway me is someone releasing a 3440x1440 with f-sync within the next couple months... well, that and terrible reviews of this one would certainly give me pause, but my current BenQ with ULMB has been good to me.
 

piesquared

Golden Member
Oct 16, 2006
1,651
473
136
Nice. So what are the chances of a G-Sync-capable gaming laptop? Probably low to none, but I would imagine the chances of a FreeSync-enabled Adaptive-Sync laptop screen are high.
 

jackstar7

Lifer
Jun 26, 2009
11,679
1,944
126
Nice. So what are the chances of a G-Sync-capable gaming laptop? Probably low to none, but I would imagine the chances of a FreeSync-enabled Adaptive-Sync laptop screen are high.

I'd say both are possible. NV will want to push its tech and will likely work out deals to make that happen.

For FreeSync, it's already being demoed on at least one laptop, so in theory it's a quarter or two from market.
 

DiogoDX

Senior member
Oct 11, 2012
757
336
136
ASUS 120Hz - 1440P - IPS - Freesync

While it lacks FreeSync or G-Sync branding, the MG279Q supports the DisplayPort 1.2a specification with Adaptive-Sync. Translation: it does have a variable refresh rate, despite what we initially surmised. We're told the monitor will be able to sync up its refresh rate with FreeSync-capable Radeon graphics cards. As to why it has that capability without the FreeSync label, we don't yet have a clear answer.

First-hand, the MG279Q looks to have slightly better viewing angles than the G-Sync-enabled ROG Swift PG278Q, which we reviewed in August. That's no doubt because the MG279Q uses IPS panel technology, while the PG278Q has a TN panel. (The PG278Q still looks eminently decent, though.)

Look for the MG279Q in stores late this quarter. Asus hasn't set a price yet, so stay tuned for more details.
http://techreport.com/news/27614/asus-mg279q-display-melds-ips-panel-variable-refresh
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
If it works as well as G-Sync, why did NVIDIA need specialized hardware and claim it couldn't be easily reproduced otherwise? I'd like more established tech journalists to review these panels as well as end user experience.
 

gorobei

Diamond Member
Jan 7, 2007
4,025
1,525
136
Hardware.fr confirms that FreeSync works in games. The result looks identical to G-Sync.

http://www.hardware.fr/news/14025/ces-amd-freesync-presque-la-point.html

There you have it guys! :rolleyes:

Interesting that AMD is withholding FreeSync branding from panels that can't go lower than 45Hz. While this indicates gaming is a priority for getting official branding, part of the real benefit for most people will be video playback and reduced power use. Not sure if I'm ready to switch to Win 8/10 for my video player.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
If it works as well as G-Sync, why did NVIDIA need specialized hardware and claim it couldn't be easily reproduced otherwise? I'd like more established tech journalists to review these panels as well as end user experience.

For one reason for sure, and possibly a second: 1) DisplayPort did not support this feature from the get-go; AMD had it added to the spec before jumping in. And 2) Nvidia may or may not be able to support this feature with current hardware.
 
Feb 19, 2009
10,457
10
76
"The player provided by Microsoft in Windows 8.1 and Windows 10 is compatible with FreeSync, and the refresh rate will adjust to the video format to ensure perfectly smooth playback."

Looks like video playback will need a player that supports FreeSync; not just any player will do the job.

I hope Media Player Classic will, cos I hate using the bloatware player that comes with recent Windows.
 
Feb 19, 2009
10,457
10
76
If it works as well as G-Sync, why did NVIDIA need specialized hardware and claim it couldn't be easily reproduced otherwise? I'd like more established tech journalists to review these panels as well as end user experience.

It's probably because NV went ahead before there was a VESA standard, so they needed special hardware ($) for it. AMD worked with VESA to get a standard into DP. Because it's a standard, we've seen interest from more hardware vendors.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
If it works as well as G-Sync, why did NVIDIA need specialized hardware and claim it couldn't be easily reproduced otherwise?
There needs to be some hardware dedicated to frame syncing. In Nvidia's case, they have their own ARM processor, so they stuck that on a board and made a module. The board seems overly complex and expensive to me considering what it does.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
"The player provided by Microsoft in Windows 8.1 and Windows 10 is compatible with FreeSync, and the refresh rate will adjust to the video format to ensure perfectly smooth playback."

Looks like video playback will need a player that supports FreeSync; not just any player will do the job.

I hope Media Player Classic will, cos I hate using the bloatware player that comes with recent Windows.

Probably the same limitation their current video stuff has.

WMP, Modern apps, and Internet Explorer video only. This should include Netflix and whatever other internet video works in IE.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
If it works as well as G-Sync, why did NVIDIA need specialized hardware and claim it couldn't be easily reproduced otherwise? I'd like more established tech journalists to review these panels as well as end user experience.

AMD has a display controller on board their cards that supports the feature. nVidia does not, so they required an add-on device to do it for them. nVidia also LOVES vendor lock-in; it's their way or the highway.

AMD has shown they prefer the open-standard way of doing things, and in this case at least, it appears it may work out for them. In fact, I would be surprised if nVidia does not eventually release cards that support a-sync and just let g-sync die off.
 

piesquared

Golden Member
Oct 16, 2006
1,651
473
136
AMD has a display controller on board their cards that supports the feature. nVidia does not, so they required an add-on device to do it for them. nVidia also LOVES vendor lock-in; it's their way or the highway.

AMD has shown they prefer the open-standard way of doing things, and in this case at least, it appears it may work out for them. In fact, I would be surprised if nVidia does not eventually release cards that support a-sync and just let g-sync die off.

Which is why purchasing any G-Sync monitor now is a bad idea. I think it's inevitable that NVidia will support A-Sync in the future (as will Intel), making A-Sync panels future-proof with cross-vendor compatibility. Besides, these new A-Sync monitors look delicious. :p
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Which is why purchasing any G-Sync monitor now is a bad idea. I think it's inevitable that NVidia will support A-Sync in the future (as will Intel), making A-Sync panels future-proof with cross-vendor compatibility. Besides, these new A-Sync monitors look delicious. :p

If you want to use Nvidia and get this feature, buying now only gives you one option. It'll likely be a couple of years and a GPU upgrade before you have another option, just like if you go AMD and FreeSync. That's assuming they do this at all.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
For one reason for sure, and possibly a second: 1) DisplayPort did not support this feature from the get-go; AMD had it added to the spec before jumping in. And 2) Nvidia may or may not be able to support this feature with current hardware.

Tom Peterson is on record saying that DP did not need any of AMD's proposed additions because it already had everything needed for variable refresh rates. I maintain that nVidia went the FPGA route because they wanted something proprietary, which is not possible going via the standards route that AMD took. So it's either that or your second reason: nVidia's display controller and signalling is not as robust as AMD's.
 

dacostafilipe

Senior member
Oct 10, 2013
805
309
136
For one reason for sure, and possibly a second: 1) DisplayPort did not support this feature from the get-go; AMD had it added to the spec before jumping in. And 2) Nvidia may or may not be able to support this feature with current hardware.

Or they just wanted to be first to have something out? If you have the money to spend, doing something on your own is always faster.

If they had gone with VESA, I don't think they would have had the screens out as fast as they did.

AMD got lucky with the timing because of all the new controllers needed for 4K. If this had happened two years earlier, we would have had to wait longer for FreeSync monitors.
 

Sunaiac

Member
Dec 17, 2014
124
172
116
Yes, because those FreeSync screens are all 4K.
Oh, wait.

Why did nvidia do what they did? If you mean why they sold the 780 at 480€ and screens with a 200€ markup when a 350€ 290 does better with a standard screen (or a 330€ 970 when a 250€ 290 does the same)?

Because 80% of people will buy an nVidia card, not a graphics card. The card has become an end in itself (owning an expensive, not-too-fast card branded nvidia) rather than a means (playing games in good conditions).

Which is actually good, because people with brains can now buy 250€ 290s with a FreeSync screen and have the same thing for much less, so they can put the money into... well, games!

Warning issued for inflammatory language.
-- stahlhart
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
It's probably because NV went ahead before there was a VESA standard, so they needed special hardware ($) for it. AMD worked with VESA to get a standard into DP. Because it's a standard, we've seen interest from more hardware vendors.

It's more likely that nVidia wanted a proprietary solution they could charge a premium for and use to promote their brand.
 