
AMD Freesync Monitors & Reviews Thread

Page 6
I want 38"+ ultrawide, 4K+, 90 Hz+, FreeSync, and curved. When a monitor meeting these criteria (or even most of them) comes out, I'll buy it if it's less than 3 grand.
 
It's the VESA DP 1.2a standard. The scaler companies are on board.
Every new monitor within half a year will support it. Some will just disable the functionality, slap a G-Sync label on, and add $100 to the price. It's simple but good tech. Thank you for the idea, NV. Now we move on.

In my opinion, waiting another year on these is a good idea. If Nvidia is truthful, they won't ever support Adaptive-Sync in their driver. Thus you can't use G-Sync monitors with AMD and you can't use FreeSync/Adaptive-Sync monitors with Nvidia. You're stuck either way, because the life of a monitor is usually quite a bit longer than a GPU upgrade cycle. I'd hate to buy a nice monitor and then decide the card I want won't work with it. Well, it will work, but you don't get the G-Sync/Adaptive-Sync benefit. Maybe in a year or so the market will decide; if not, at the very least there will be better displays.
 
I have no doubt we'll see a 1440p 120 Hz+ IPS monitor with FreeSync. In fact, I think ASUS's first 1440p 120 Hz+ IPS (the MG279Q) is likely to support FreeSync.

The main issue for me is whether or not it also has some sort of strobing feature to compete with the ULMB side of G-Sync, as that is a feature FreeSync does not address.


You do know that ULMB and G-Sync are mutually exclusive? You can't use ULMB with G-Sync; you have to turn G-Sync off.
 
You do know that ULMB and G-Sync are mutually exclusive? You can't use ULMB with G-Sync; you have to turn G-Sync off.

Yes, but you also can't have ULMB outside of a G-Sync monitor. There are some alternatives, but nothing as standardized.

At any rate it's a great combo: G-Sync for when you can't maintain a minimum framerate of 85 fps, ULMB for when you can.
 
Maybe in a year or so the market will decide; if not, at the very least there will be better displays.

FreeSync will win the overall market due to widespread DisplayPort 1.2a adoption, but this choice won't be decided by AMD or Nvidia users but by Intel, since Broadwell supports 1.2a. I suspect Nvidia will have to adopt some form of open 1.2a support for its mobile GPUs, since I doubt the laptop market will be as willing to pay a G-Sync premium. Whether G-Sync survives long term as a premium sync alternative with unique features for Nvidia users remains to be seen.
 
FreeSync will win the overall market due to widespread DisplayPort 1.2a adoption, but this choice won't be decided by AMD or Nvidia users but by Intel, since Broadwell supports 1.2a. I suspect Nvidia will have to adopt some form of open 1.2a support for its mobile GPUs, since I doubt the laptop market will be as willing to pay a G-Sync premium. Whether G-Sync survives long term as a premium sync alternative with unique features for Nvidia users remains to be seen.

This.
All those arguments about, e.g., a French site seeing ghosting on a display they tested of course mean zero in the broader DP 1.2a adoption picture. It's nitpicking. Exactly like AMD claiming their solution costs only 0.2 percent performance while NV's costs 1.5% or so. Who cares? It's not enthusiasts deciding the market.
 
I will take the LG 29UM67 as soon as it's available here.

I won't be using FreeSync most of the time because of OS X, but the 75 Hz will certainly be a benefit.
 
Unlike Mantle, which nobody will use, and the non-existent hardware physics that was promised but never delivered, FreeSync is actually good news.

It's a working solution that's an open standard. It should lower the prices of G-Sync monitors and drive wider adoption of the new display paradigm as a whole.
 
Thanks.

I'll avoid these then. The review also points out what I suspected: 40 Hz is too high a minimum, since games will often drop to 30 fps or even less. Basically, the lower the better; 30 seems the acceptable mark.

Gonna have to wait for better models to upgrade.

Yep. And it seems that when using FreeSync/G-Sync, if you get below the minimum FPS, you end up with a worse experience than on a regular monitor.

Imagine 4K with 40 or 48FPS minimums...
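To make the minimum-refresh problem concrete, here's a minimal sketch (not AMD's actual algorithm) of the frame-multiplication idea later marketed as Low Framerate Compensation: when the source framerate falls below the panel's VRR floor, each frame is simply shown multiple times so the physical refresh stays inside the window. The 40-144 Hz window and the function name are assumptions for illustration; the early panels discussed in this thread reportedly just fell back to plain v-sync behavior below the minimum.

```python
def effective_refresh(fps, vrr_min=40, vrr_max=144):
    """Map a source framerate into a hypothetical 40-144 Hz VRR window.

    Below vrr_min, repeat each frame n times (LFC-style frame
    multiplication) so the panel still refreshes inside its range.
    Returns (panel_refresh_hz, repeats_per_frame).
    """
    if fps >= vrr_min:
        return min(fps, vrr_max), 1
    # Smallest multiplier that lifts the refresh back into the window.
    n = 2
    while fps * n < vrr_min:
        n += 1
    return fps * n, n

for fps in (144, 75, 40, 30, 20):
    hz, n = effective_refresh(fps)
    print(f"{fps:>3} fps -> panel at {hz} Hz ({n}x)")
```

So a 30 fps dip on a 40 Hz-minimum panel could still be displayed smoothly at 60 Hz with doubled frames, which is why the lack of such handling on these first monitors drew criticism.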
 
http://forums.overclockers.co.uk/showpost.php?p=27799387&postcount=176
Is it really as simple as turning off something at the monitor to eliminate the ghosting?

It's not an absolute solution, as some monitors do not have the right option.

The BenQ does seem to, according to owners.

The Acer does have that option, but somehow it does not work at lower frequencies (seems like a firmware issue); it's fine at 144 Hz.

The LG should have that option because the UM65 has it.
 
Imagine 4K with 40 or 48FPS minimums...

All you can do is imagine them, since they don't exist. Let's see what they support when they are released.

40 is probably fine for me; a range up to 75+ would be nice. I try to limit my settings to sit at a constant monitor refresh rate, but supporting the occasional drop is the bonus I'd want FreeSync for.

I'm glad it's free, since there are quite a few things I want in a monitor and I'm certainly not willing to pay $200 for the same experience, especially for a single feature.
 
AMD says different components are needed.

I hope this is not the case, especially if the G-Sync module does indeed use its buffer to do overdrive calculations on a per-frame basis to eliminate ghosting. I assume current monitor hardware does overdrive/blur reduction based on a constant refresh rate (so programmed timings, with no calculation needed), which eliminates the need and extra cost of a buffer and silicon to perform overdrive calculations.
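For what it's worth, the fixed-refresh overdrive described above can be sketched as a simple lookup: given the previous and target pixel levels, the scaler drives the panel past the target so the liquid crystal settles within one known, constant frame time. The table below is invented for illustration; a real scaler uses a dense, panel-tuned table, and the point of a per-frame calculation (as the G-Sync module's buffer is speculated to enable) is that the right amount of overshoot depends on how long the frame will actually be displayed, which variable refresh makes unpredictable.

```python
# Toy illustration of fixed-refresh panel overdrive. The
# (previous_level, target_level) -> drive_level table is invented;
# real scalers use dense, panel-specific tables tuned for one
# constant frame time.
OVERDRIVE_LUT = {
    (0, 128): 160,    # rising transition: overshoot above the target
    (255, 128): 96,   # falling transition: undershoot below the target
    (0, 255): 255,    # already at maximum drive, nothing to boost
}

def drive_level(prev, target):
    """Overdriven level for one pixel transition (0-255 scale),
    falling back to the plain target when the table has no entry."""
    return OVERDRIVE_LUT.get((prev, target), target)

print(drive_level(0, 128))    # boosted past the 128 target
print(drive_level(255, 128))  # pulled below the 128 target
```

With a variable frame time, a static table like this is tuned for the wrong duration most of the time, which is one plausible source of the ghosting seen on the first FreeSync panels.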
 
All you can do is imagine them, since they don't exist. Let's see what they support when they are released.

40 is probably fine for me; a range up to 75+ would be nice. I try to limit my settings to sit at a constant monitor refresh rate, but supporting the occasional drop is the bonus I'd want FreeSync for.

I'm glad it's free, since there are quite a few things I want in a monitor and I'm certainly not willing to pay $200 for the same experience, especially for a single feature.

The perspective doesn't look good from the first 4 monitors and 3 manufacturers.

And it's less than $200. The closest comparison we have for now is the Swift vs. the BenQ, with a $150 delta. But the BenQ is nowhere near the Swift's quality.
 
The perspective doesn't look good from the first 4 monitors and 3 manufacturers.

And it's less than $200. The closest comparison we have for now is the Swift vs. the BenQ, with a $150 delta. But the BenQ is nowhere near the Swift's quality.

Less than $200? How about $160-$260 differences?

Looking at direct comparisons, the Acer XG270HU and BenQ XL2730Z are WQHD 144Hz panels, which pits them against the $759 ASUS ROG Swift that we recently reviewed, giving FreeSync a $160 to $260 advantage.
For those that want higher refresh rates, Acer and BenQ have TN-based 40-144Hz displays. Both are 27” WQHD displays, so it’s quite probable that they’re using the same panel, perhaps even the same panel that we’ve seen in the ASUS ROG Swift.
http://www.anandtech.com/show/9097/the-amd-freesync-review/2
 
The Acer doesn't use the same panel. It's a 6-bit+FRC panel, unlike the true 8-bit panel of the Swift and maybe the BenQ.
 
I don't see the Acer in your link.

The BenQ:
http://www.amazon.com/BenQ-XL2730Z-...8&qid=1426849021&sr=1-1&keywords=BenQ+XL2730Z
$630-707

The Swift:
http://www.amazon.com/dp/B00MSOND8C?tag=vglnkc3350-20
$724-759

That's $100 or less. And the Swift doesn't have ghosting issues and works from 30 FPS, while the BenQ suffers from ghosting and requires 40 FPS.

[Attached image: pursuit2.jpg]


Of course the Swift has ghosting. As you can see, the slower the refresh, the more pronounced it is.

http://www.tftcentral.co.uk/reviews/asus_rog_swift_pg278q.htm
 