[Sweclockers] Asus MG279Q is to have the same panel as Acer XB270HU

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
Sweclockers have managed to get official confirmation that Asus' answer to the Acer monitor - which was widely praised - will use the exact same panel.

That IPS display garnered a lot of positive commentary because it seemed to overcome the one fundamental weakness historically associated with IPS displays, namely response time.

The ASUS display is going to be Freesync-compatible and it's going on sale in May here in Sweden at 6500 kr. By comparison, the Acer monitor costs almost 8000 kr. So the G-sync tax is indeed intact.


Source
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
I wish people would stop calling it a tax; it's not really a tax if you already have an Nvidia card, is it?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Good news. The market really needs a 30-144Hz 1440P FreeSync IPS monitor. We need more options if FreeSync is to take off.

I wish people would stop calling it a tax; it's not really a tax if you already have an Nvidia card, is it?

The increased monitor price comes from the GSync module. That has nothing to do with whether or not you have an NV or AMD GPU in it. With 2 identical monitors, the GSync one will cost $100-200 more because of the GSync module/licensing fees. That's called the GSync tax. If I get a Pascal NV GPU and I want an Adaptive Sync monitor, I will have to pay $100-200 extra for the GSync monitor. That means for someone who hasn't upgraded to an A-Sync monitor yet, the GSync tax exists and it's real regardless of our future GPU upgrade.
 
Last edited:

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
Maybe a side by side comparison will put a lot of arguments to rest. Provided the module is the only real difference, of course.

What's the minimum refresh rate on this panel?
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
Good news. The market really needs a 30-144Hz 1440P FreeSync IPS monitor. We need more options if FreeSync is to take off.



The increased monitor price comes from the GSync module. That has nothing to do with whether or not you have an NV or AMD GPU in it. With 2 identical monitors, the GSync one will cost $100-200 more because of the GSync module/licensing fees. That's called the GSync tax. If I get a Pascal NV GPU and I want an Adaptive Sync monitor, I will have to pay $100-200 extra for the GSync monitor. That means for someone who hasn't upgraded to an A-Sync monitor yet, the GSync tax exists and it's real regardless of our future GPU upgrade.

That does not make any sense if you already have an Nvidia card; it's not a tax at all. It's just a monitor upgrade. A tax is if you buy a monitor and have to upgrade to a new video card just to use it... in that scenario a FreeSync monitor IS a tax as well.

FreeSync won't take off simply because Nvidia is not supporting it. If it does, great, but it's not going to be something to consider unless the market share somehow magically shifts towards AMD... and that looks less likely every year.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
That does not make any sense if you already have an Nvidia card; it's not a tax at all. It's just a monitor upgrade. A tax is if you buy a monitor and have to upgrade to a new video card just to use it... in that scenario a FreeSync monitor IS a tax as well.

FreeSync won't take off simply because Nvidia is not supporting it. If it does, great, but it's not going to be something to consider unless the market share somehow magically shifts towards AMD... and that looks less likely every year.

You apparently don't get it. I'm not sure how you cannot comprehend what he said, but, whatever, I'll give it a try to spare Russian.

The idea of the "tax" is not tied to the video card, whatsoever. If you have to upgrade your components to get a new feature, that's called the pace of hardware change. It's quite simple really, it happens all the time, and has not once, ever, been referred to as a "tax" in this regard.

Let's put it this way. Let's presume that Nvidia may support Freesync in the future, because not supporting a VESA standard is rather ridiculous in the long run. So now, you wish to buy a new monitor. The Gsync monitor costs $200 more than the Freesync monitor.

That's the Gsync tax. Gsync costs more because of the way Nvidia designed it. We're ignoring the idea of not even owning an Nvidia card. The point is, Gsync monitors cost more than Freesync monitors because of the necessary hardware that goes into the monitors.
 

Annisman*

Golden Member
Aug 20, 2010
1,931
95
91
That does not make any sense if you already have an Nvidia card; it's not a tax at all. It's just a monitor upgrade. A tax is if you buy a monitor and have to upgrade to a new video card just to use it... in that scenario a FreeSync monitor IS a tax as well.

FreeSync won't take off simply because Nvidia is not supporting it. If it does, great, but it's not going to be something to consider unless the market share somehow magically shifts towards AMD... and that looks less likely every year.

I think it can help AMD market share. For me, I've been stuck on Nvidia since the release of Gsync, as I simply must have variable refresh rate now. A-Sync takes that hold away from Nvidia. Also, speaking as someone who just bought an Acer 270HU, I don't consider Gsync a tax, but FreeSync becomes that much more attractive because it is the cheaper option.
 
Dec 30, 2004
12,553
2
76
If the same panel costs 20% more for G-Sync than it does with FreeSync, that's a 20% tax on my monitor upgrade
 

Annisman*

Golden Member
Aug 20, 2010
1,931
95
91
If the same panel costs 20% more for G-Sync than it does with FreeSync, that's a 20% tax on my monitor upgrade

Only if A-Sync is in fact as effective as Gsync; otherwise it's not a tax, it's paying a premium for a better technology. I'm just saying, keep that thought in mind.
 

biostud

Lifer
Feb 27, 2003
19,672
6,760
136
Seems that the minimum refresh rate is important to FreeSync, unless they fix the vsync/tearing at low fps through drivers.
 
Dec 30, 2004
12,553
2
76

Only if A-Sync is in fact as effective as Gsync; otherwise it's not a tax, it's paying a premium for a better technology. I'm just saying, keep that thought in mind.

yes, but that's spreading Fear/Uncertainty/Doubt. So far, we have no reason to think G-Sync is superior. Anandtech has so far been very positive about the Freesync demos they've seen.

It doesn't take a computer engineer to tell you you shouldn't need a special module. Just add a start/end of frame descriptor to the displayport datastream and don't have the monitor refresh until it sees the end of the next frame. Very, very, very simple. I'm an electrical engineer. It really is that simple.
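A toy sketch of that idea (illustrative only: the constants and names below are made up, and real scaler firmware obviously isn't written like this, but it shows the claimed logic):

Code:
# Hypothetical model of "refresh when the end-of-frame marker arrives".
MAX_HOLD_MS = 33.3   # panel must be refreshed at least every 1/30 s
MIN_HOLD_MS = 6.9    # panel cannot be refreshed faster than 1/144 s

def next_action(ms_since_last_refresh, end_of_frame_seen):
    """Decide what the monitor should do right now."""
    if end_of_frame_seen and ms_since_last_refresh >= MIN_HOLD_MS:
        return "scan out the newly received frame"
    if ms_since_last_refresh >= MAX_HOLD_MS:
        return "forced refresh (no new frame arrived in time)"
    return "keep holding the current frame"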
 
Last edited:

Annisman*

Golden Member
Aug 20, 2010
1,931
95
91
yes, but that's spreading Fear/Uncertainty/Doubt. So far, we have no reason to think G-Sync is superior. Anandtech has so far been very positive about the Freesync demos they've seen.

It doesn't take a computer engineer to tell you you shouldn't need a special module. Just add a start/end of frame descriptor to the displayport datastream and don't have the monitor refresh until it sees the end of the next frame. Very, very, very simple. I'm an electrical engineer. It really is that simple.

Makes me wonder why Nvidia went the whole module route in the first place. Did they want to cash in that badly on a proprietary system, knowing full well ahead of time that something like Free-Sync would pop up?
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
ULMB is also something that's tied to G-Sync. I don't know if the module plays a role in ULMB, but if it does, then it means the module is capable of doing more than matching the next refresh to the next frame.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
I think it can help AMD market share. For me, I've been stuck on Nvidia since the release of Gsync, as I simply must have variable refresh rate now. A-Sync takes that hold away from Nvidia. Also, speaking as someone who just bought an Acer 270HU, I don't consider Gsync a tax, but FreeSync becomes that much more attractive because it is the cheaper option.

I currently have a GTX 970 and I'm shopping for a new monitor in the 27" range. Won't even consider Gsync, as it puts the price point into what I'll call silly land.

144Hz IPS without Gsync sounds good to me, depending on reviews and pricing of course.
 

Riceninja

Golden Member
May 21, 2008
1,841
3
81
That does not make any sense if you already have an Nvidia card; it's not a tax at all. It's just a monitor upgrade. A tax is if you buy a monitor and have to upgrade to a new video card just to use it... in that scenario a FreeSync monitor IS a tax as well.

When you upgrade a monitor and have to pay $200 over an identical FreeSync version, that's called a tax.

FreeSync won't take off simply because Nvidia is not supporting it. If it does, great, but it's not going to be something to consider unless the market share somehow magically shifts towards AMD... and that looks less likely every year.

Mass market monitors are low-margin items, and manufacturers will favour FreeSync over Gsync because that extra module might be 30% of the hard costs. And when it comes to competing standards, the one adopted at higher volume will always win.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
Oh my lord. Are you guys seriously comparing two monitors from two different brands, and saying the difference in price is because of the G-sync module? It's two different brands. You can't do it like that.

Take the Acer XG270HU and the BenQ XL2730Z. Both have identical specs and are 27" Freesync monitors: 144Hz, TN, 1440P. The Acer costs $499, the BenQ costs $599. The BenQ doesn't have a G-sync module.

Stop with the silly comparisons please. Acer may reduce the price now because of Asus' move.
I welcome this Asus monitor. Not a bad price either, and certainly good to have G-sync alternatives. I'm glad more brands are starting to use this glorious IPS panel :thumbsup:
 
Last edited:

96Firebird

Diamond Member
Nov 8, 2010
5,737
334
126
Will be very interesting to see comparisons between the two monitors... Min/max refresh rates, ghosting, behavior below/above variable refresh range, input lag, etc...
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
You apparently don't get it.

Again, not a tax. If I wanted a FreeSync panel, I'd still have to get a video card: $300+ for the kind needed for this panel. I've already got a Gsync card and I'm getting a monitor anyway, which is even less than a new card.

The savings are still there no matter how you dice it.
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
It doesn't take a computer engineer to tell you you shouldn't need a special module. Just add a start/end of frame descriptor to the displayport datastream and don't have the monitor refresh until it sees the end of the next frame. Very, very, very simple. I'm an electrical engineer. It really is that simple.

Of course it's simple. Everything is simple.

I'll give you something to think about.
Maybe you can tell me the answer.

When you have a monitor that can do 30Hz - 144Hz, that actually means the monitor can hold a frame on screen for anywhere between 7 milliseconds and 33 milliseconds. G-Sync (and FreeSync) ensure that everything on the screen looks smooth, as long as the monitor receives a new frame to display from the GPU at least every 33 milliseconds.

But what does a monitor do when, after 33 milliseconds, there is still no new frame?

There are 3 options.
1) It doesn't do anything. Result: the screen will turn white, until a new frame arrives. This gives flickering. We don't want that.
2) The monitor displays the last frame again. To do this, the monitor needs to have the last frame stored somewhere. The G-Sync module has memory with the last frame in it. So the G-Sync module can do this. FreeSync cannot.
3) The monitor depends on the GPU. If the interval between two frames is longer than 33 milliseconds, the GPU needs to resend its last frame.

Now there is one thing that many people tend to forget when talking about networks. (And yes, the monitor and the PC form a network.) Networks never have infinite bandwidth and they never have zero delay. In our case, DP1.2a has an effective bandwidth of 17.28Gbps, which allows something like 180-190 1440p frames per second. That means that when the GPU sends a new frame, there will be ~6 milliseconds between the first bit and the last bit of the frame.
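Rough arithmetic behind those numbers, assuming an uncompressed 24-bit 2560x1440 frame and ignoring blanking and protocol overhead (add the overhead back and you land around the ~6 ms / 180-190 fps figures above):

Code:
# Back-of-the-envelope check of the frame transfer time over DP 1.2a.
link_bps    = 17.28e9                 # effective bandwidth after 8b/10b coding
frame_bits  = 2560 * 1440 * 24        # one uncompressed 1440p frame
max_fps     = link_bps / frame_bits           # ~195 frames per second
transfer_ms = frame_bits / link_bps * 1000    # ~5.1 ms to push one frame
print(round(max_fps), round(transfer_ms, 1))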

That means that if the GPU needs to send a duplicate frame, to prevent the monitor from showing white pixels, it needs to make the decision not 33ms after it finished sending its last frame, but 27 milliseconds after sending its last frame. Otherwise the monitor will not have received the full frame when it needs to be displayed.

Did I make any mistakes so far ?

Now what happens if the GPU finishes its next frame right after it started sending the duplicate frame? Does it stop sending the duplicate frame and immediately start sending the new frame? It can't, because then the monitor will not have received the full new frame before the previous frame has expired. And even if the GPU did stop sending, you'll get tearing on the monitor.

With G-Sync, this problem is easier. The monitor has a copy of the last frame. That means the monitor can decide whether to display the last frame again or not. So now let's look at what a G-Sync monitor can do. When the current frame is about to expire after 33 milliseconds, it has to make the decision to display the last frame again or not. So it has 6 milliseconds more time to make that decision!

G-Sync can even do something smarter.
When a monitor has displayed a frame for 27 milliseconds, it can look at its incoming data, and see if a new frame has started to be sent or not. If indeed a new frame is incoming, it can wait up to 6 milliseconds to receive the full new frame. And then display the new frame. The previous frame will not be displayed twice.

Now suppose a frame has been displayed for 27 ms, and no new frame is incoming. The monitor can then decide to show the current frame a second time. Note, the minimum holding time for a frame on the screen is 7 milliseconds (on a 144Hz monitor). Now suppose 1 microsecond later a new frame starts coming in. It'll take 6 milliseconds to receive the full frame. And 1 ms later, the screen is ready to display a new frame. Hardly any deviation from the points in time when frames should have been displayed.

Did I make myself clear ?

A G-Sync monitor can be smoother at low frame rates, because it can look 6 milliseconds into the future. A FreeSync monitor cannot.
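A minimal sketch of the two decision points described above, using the same assumed numbers (purely an illustration of the argument; the names are made up and neither vendor's firmware actually looks like this):

Code:
MAX_HOLD_MS = 33.3   # longest a 30 Hz-minimum panel may hold one frame
XFER_MS     = 6.0    # assumed time to push one full frame over the link

def freesync_gpu_decision(ms_since_last_frame_sent, new_frame_ready):
    # No frame buffer in the monitor, so the GPU must commit to resending the
    # old frame one full transfer time before the 33 ms deadline (~27 ms).
    if new_frame_ready:
        return "send the new frame"
    if ms_since_last_frame_sent >= MAX_HOLD_MS - XFER_MS:
        return "resend the previous frame (deadline looming)"
    return "wait"

def gsync_monitor_decision(ms_since_refresh, new_frame_incoming):
    # The module holds a copy of the last frame, so the monitor can peek at the
    # link at ~27 ms and still self-refresh locally right at the deadline.
    if new_frame_incoming:
        return "wait up to XFER_MS and scan out the new frame"
    if ms_since_refresh >= MAX_HOLD_MS:
        return "redisplay the buffered frame locally"
    return "keep holding the current frame"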


But yeah, it's all simple. No reason to make things complicated. The G-Sync monitor is all bollocks. Any engineer can see that.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Again, not a tax. If I wanted a FreeSync panel, I'd still have to get a video card: $300+ for the kind needed for this panel. I've already got a Gsync card and I'm getting a monitor anyway, which is even less than a new card.

The savings are still there no matter how you dice it.

So a company charging you more for the same feature isn't a "tax"? If you didn't have an nVidia card (which you likely paid the nVidia tax on), then you could save ~$150+ on your monitor as well by not paying the nVidia tax on that too.

Now, I understand that if you prefer nVidia, or the particular nVidia card you have, then it's simply the cost of admission.