[TFTCentral - Review] Acer XB270HU - 27" 144Hz IPS G-Sync

Page 4

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
By default, G-Sync is enabled at the driver level once the monitor is detected to be G-Sync capable. There will be a taskbar notification saying a G-Sync monitor has been detected and enabled. In the game you'll still need to turn vsync off and maybe set the refresh rate, as Pia has said.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
What you guys are clearly missing is the fact that FreeSync appears to have little or possibly even no price premium over a monitor without it (Asus $599 IPS FS vs. TN GS $799). Why wouldn't you take a FreeSync monitor? Even if you switch GPUs you haven't lost anything. With G-Sync's $150-200 premium, you have a lot to lose, and it therefore ties you down.

I haven't seen any statements about not being tied to a GPU maker; where are these posts? Technically it isn't even tied to a GPU manufacturer.

First, I'm not commenting on the price premium. I have made no comments about that; I don't care about the price, that's not my point at all, so let's just ignore that tidbit right there.

Now to the point:

I'm reading about G-Sync and trying to get excited, but if it's tied to only Nvidia GPUs it's going to be a failure.

I'd never buy a monitor for a feature that will tie me down to one graphics chip maker; it's stupid.

That Asus with Adaptive Sync is where the money is at, and that's where I expect the market to go.

G-Sync is already dead based on its proprietary tie to Nvidia in a quickly evolving gaming market between AMD, Nvidia, and Intel.

And many more posts like this are out here on this forum. I keep reading that G-Sync sucks because it's tied to one GPU manufacturer, and that Adaptive Sync is amazing because anyone can use it, yet there are only two GPU manufacturers, one of them is using G-Sync, and they won't stop. AMD only uses FreeSync.

How does getting a FreeSync monitor allow me to use an Nvidia GPU with the FreeSync capabilities?
 

Pantalaimon

Senior member
Feb 6, 2006
341
40
91
How does getting a FreeSync monitor allow me to use an Nvidia GPU with the FreeSync capabilities?

1. It's not a FreeSync monitor. It's an Adaptive Sync monitor. FreeSync is AMD's way to utilize VESA's Adaptive Sync standard.

2. It can allow you to use an NVIDIA GPU later should NVIDIA choose to adopt Adaptive Sync and come up with a way of its own to utilize it. Whereas I don't see how AMD could utilize G-Sync, considering it's NVIDIA's own IP.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
Some people seem incapable of grasping this. Because AMD is promoting A-Sync, they assume it must work exactly the way Nvidia is promoting G-Sync.

That is probably true, but tential's point is that if Nvidia plans to go with G-Sync only for the foreseeable future, then Adaptive Sync monitors are basically AMD-only monitors. Reminds me of the HD DVD and Blu-ray war, where the bigger fish (Sony/Blu-ray) may win over the cheaper, more open solution.
 

Pantalaimon

Senior member
Feb 6, 2006
341
40
91
That is probably true, but tential's point is that if Nvidia plans to go with G-Sync only for the foreseeable future, then Adaptive Sync monitors are basically AMD-only monitors. Reminds me of the HD DVD and Blu-ray war, where the bigger fish (Sony/Blu-ray) may win over the cheaper, more open solution.

From what I can see though, there's a bigger chance of Intel and NVIDIA adopting Adaptive Sync than of AMD and Intel adopting G-Sync, because it's an industry standard whereas G-Sync is owned by NVIDIA. See PhysX.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
From what I can see though, there's a bigger chance of Intel and NVIDIA adopting Adaptive Sync than of AMD and Intel adopting G-Sync, because it's an industry standard whereas G-Sync is owned by NVIDIA. See PhysX.

NVIDIA won't give up G-Sync for Adaptive Sync, and Intel is a non-factor in gaming, so even if they adopt it, that won't mean anything to gamers.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
NVIDIA won't give up G-Sync for Adaptive Sync, and Intel is a non-factor in gaming, so even if they adopt it, that won't mean anything to gamers.

Which is unfortunate, as it ties the monitor purchase to the video card purchase for gamers. Hopefully, based on this Acer monitor, the same panels will be used for both Adaptive-Sync and G-Sync so that selection is not limited for gamers.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Which is unfortunate, as it ties the monitor purchase to the video card purchase for gamers. Hopefully, based on this Acer monitor, the same panels will be used for both Adaptive-Sync and G-Sync so that selection is not limited for gamers.

I agree; personally I'd rather NVIDIA support both G-Sync and DP 1.2a so people have more choices, but from a business standpoint it doesn't make sense for them to do that.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
NVIDIA won't give up G-Sync for Adaptive Sync, and Intel is a non-factor in gaming, so even if they adopt it, that won't mean anything to gamers.

It doesn't really matter that Intel is a non-factor in gaming. If they adopt ASYNC, then every monitor manufacturer will almost certainly go for it as well (AMD and Intel combined make up about 85% of the graphics market, after all), seeing as there are other advantages to ASYNC outside of gaming, plus the fact that Intel is of course not a non-factor in gaming (plenty of people play light games or games at low settings on Intel graphics; just take a look at the Steam survey). This will essentially force Nvidia to adopt it as well (too big of a potential market to leave on the table).

And then GSYNC will most likely die off due to ASYNC being the more widely available and cheaper alternative thanks to economies of scale.
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
Not sure why people care about adoption anyway... this tech is aimed at a certain market: gamers. It's not as if business users or anyone else cares.

Most gamers lean AMD or Nvidia, so it's a no-brainer what monitor they want anyway.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
What people care about is not being locked into a brand. Unfortunately, that is what we are going to have to live with for the foreseeable future.
 

Ichigo

Platinum Member
Sep 1, 2005
2,158
0
0
$250 Nvidia GPU + $600 monitor vs $400 AMD GPU + $450 monitor hmmmmmm


Thread crapping will not be tolerated; also, this post has nothing to do with the thread topic.

-Rvenger
 
Last edited by a moderator:

moonbogg

Lifer
Jan 8, 2011
10,734
3,454
136
Panel tech is getting kinda good lately. My next upgrade should be quite the treat. No rush here though.
 

rgallant

Golden Member
Apr 14, 2007
1,361
11
81
It doesn't really matter that Intel is a non-factor in gaming. If they adopt ASYNC, then every monitor manufacturer will almost certainly go for it as well (AMD and Intel combined make up about 85% of the graphics market, after all), seeing as there are other advantages to ASYNC outside of gaming, plus the fact that Intel is of course not a non-factor in gaming (plenty of people play light games or games at low settings on Intel graphics; just take a look at the Steam survey). This will essentially force Nvidia to adopt it as well (too big of a potential market to leave on the table).

And then GSYNC will most likely die off due to ASYNC being the more widely available and cheaper alternative thanks to economies of scale.
This.
Down the road, peeps that picked a good low-priced gaming monitor might go AMD when upgrading the GPU, if it by chance was ASYNC-enabled.
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
What people care about is not being locked into a brand. Unfortunately, that is what we are going to have to live with for the foreseeable future.

Like I said, people make up their minds beforehand anyway. They don't care; as long as it works is what matters to them. If I buy a $500 graphics card and a monitor with G-Sync comes out that is $1000, I will buy it. Even if an AMD-based monitor is $500, it would not matter to me. The cost of switching video card and monitor would not make any difference.

Brand loyalty is stronger than choices. If that were not the case, ad agencies would not exist. :p
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Like I said, people make up their minds beforehand anyway. They don't care; as long as it works is what matters to them. If I buy a $500 graphics card and a monitor with G-Sync comes out that is $1000, I will buy it. Even if an AMD-based monitor is $500, it would not matter to me. The cost of switching video card and monitor would not make any difference.

Brand loyalty is stronger than choices. If that were not the case, ad agencies would not exist. :p

Not everyone is loyal to a brand. I'm personally holding out on upgrading until I see what I'm in for. Most likely the decision will be made after/when I upgrade my GPU.
 

Pantalaimon

Senior member
Feb 6, 2006
341
40
91
Like I said, people make up their minds beforehand anyway. They don't care; as long as it works is what matters to them. If I buy a $500 graphics card and a monitor with G-Sync comes out that is $1000, I will buy it. Even if an AMD-based monitor is $500, it would not matter to me. The cost of switching video card and monitor would not make any difference.

So basically they are like the Apple users who don't mind paying extra :)
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
So basically they are like the Apple users who don't mind paying extra :)


No, read that again. He said that if he had a GPU he would buy a monitor that worked with it, because it isn't worth switching vendors just because the monitor is cheaper. The GPU is already in the system.
 

kasakka

Senior member
Mar 16, 2013
334
1
81
I agree; personally I'd rather NVIDIA support both G-Sync and DP 1.2a so people have more choices, but from a business standpoint it doesn't make sense for them to do that.

I wish they had done this for the 9xx series, even if the feature was not enabled on release (as there are no displays for Adaptive Sync either), just so they could go "well, we support both, so you can just choose any display you like". Unfortunately they decided to forcibly push G-Sync for Nvidia cards.

For me the decision to go G-Sync was pretty easy as I know I'll be sticking to Nvidia cards in the future as well. My reason is simply that Nvidia cards are a helluva lot easier to get working properly on Hackintosh (OSX) compared to AMD cards.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
$250 Nvidia GPU + $600 monitor vs $400 AMD GPU + $450 monitor hmmmmmm

Exactly which $250 Nvidia GPU performs the same as a $400 AMD GPU? None that I know of. It's probably closer to:

$350 Nvidia GPU + $600 monitor vs $300 AMD GPU + $450 monitor


Next time don't take the bait from other members. Let's get back on topic.


-Rvenger
 
Last edited by a moderator:

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
Pretty sure he's saying it's not much of a contest when you're spending the same amount of money to get a very similar monitor, but you get way more AMD GPU than you would NV. If the monitors are indeed significantly cheaper for AMD GPUs, that'd be a huge boost for an AMD Adaptive Sync solution, because the price difference would help offset concerns about getting locked in.

Let's say for the sake of example that the monitor will last five years or so, and in that time frame it'll see two graphics cards paired with it (the people who run faster upgrades tend to flip their cards, so the economics aren't as clean, but I have a feeling they're similar). If there's a ~$100 difference, that's $50 more to spend on each card. You'd have to be deliberately timing things badly to not be able to get better performance from AMD than from a $50 cheaper NV card (and usually vice versa, to be fair).

So with a big enough price delta between FreeSync and G-Sync, the worry that lock-in will stick you with worse cards isn't actually a big worry, and that would make AMD a really sound high-end buy for the monitor + GPU + upgrade-GPU solution.

(says a guy with a 970)
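
A rough sketch of that upgrade math, just to make it concrete. The ~$100 monitor premium and the two-GPU lifetime are the assumptions from the post above, not real pricing data:

```python
# Back-of-the-envelope math from the post above: spread an assumed ~$100
# G-Sync monitor premium across the GPUs paired with it over its lifetime.
MONITOR_PREMIUM = 100   # assumed extra cost of the G-Sync monitor, in dollars
GPU_UPGRADES = 2        # assumed number of cards paired with it over ~5 years

extra_budget_per_card = MONITOR_PREMIUM / GPU_UPGRADES
print(f"Extra GPU budget per upgrade: ${extra_budget_per_card:.0f}")  # -> $50
```

With those assumptions, the Adaptive Sync buyer has roughly $50 more to spend on each card, which is where the $50-per-upgrade figure in the post comes from.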
 

DigDog

Lifer
Jun 3, 2011
14,789
3,077
136
Speaking of G-Sync, if the AMD competition works comparably well, I think NV will likely kill G-Sync themselves and bring out a FreeSync clone, i.e. their own way of using A-Sync.

N-Sync?
(I'm serious though)
 

dacostafilipe

Senior member
Oct 10, 2013
810
315
136
Speaking of G-Sync, if the AMD competition works comparably well, I think NV will likely kill G-Sync themselves and bring out a FreeSync clone, i.e. their own way of using A-Sync.

Personally, I don't think that nVidia can fight all those controller manufacturers.

But, from a business point of view, they certainly don't want to release a simple Adaptive-Sync-compatible G-Sync. This would be like saying that AMD was right from the start, and it would hurt nVidia's image.

Knowing nVidia, they are already working on an Adaptive-Sync-compatible "G-Sync 2.0". It would introduce multiple new features (e.g. VRR-compatible ULMB?) that would only work on nVidia GPUs.