
G-Sync "Tax"

Phynaz

Lifer
First, I hate using the term "tax" — Apple Tax, G-Sync Tax, etc.

But...
My kid asked for a second monitor for his birthday to go with his GTX 1070. The cheapest IPS G-Sync display I could find is $500. What the heck, it's been over two years, the price should have dropped by now.

Ended up with a Dell Ultrasharp for half the price 🙁

P.S. This is in the Nvidia section so we don't get into another brand war.
 
It's probably not popular, so not many units are made. What's the point of it, really? But that's just me; it has never seemed necessary to me.

Or it could be because it's pretty much marketed to the enthusiast market, making it more of an "enthusiast tax".
 
G-Sync adds approximately $200 to the price of a monitor. Alienware, which as we know is Dell-owned, just released its own branded G-Sync monitor, and it is also $200 more than the adaptive sync version, which is consistent with the market.
 
Well, that's 16:10, plus one of the better 24" panels from Dell. It should fare at least on par with, if not better than, most $500 G-Sync monitors in terms of image quality.

Yes, image quality is important, and the Ultrasharp displays have always been excellent in that regard. I don't think I've ever bought anything else. As a programming student he's staring at the thing all day long.

That's another one of my complaints, I don't think there are any 16:10 G-Sync displays out there. At least none that aren't more than my house payment.
 
Nvidia is going to have to cave in sooner or later to adaptive sync. There are way, way more freesync monitors available, it's cheaper, and it's almost embarrassing to see the market leader not supporting the more prolific option.
 
Eh, I'd still get Gsync. Can't understand why you'd get an Nvidia GPU and then cheap out on the monitor.
 
Eh, I'd still get Gsync. Can't understand why you'd get an Nvidia GPU and then cheap out on the monitor.

Agreed. A monitor is something you usually keep for an extended time and frankly the extra $200 is a drop in the bucket if you keep the monitor for 2 or 3 years. I've found it to be very much worth the extra cost.
 
Eh, I'd still get Gsync. Can't understand why you'd get an Nvidia GPU and then cheap out on the monitor.
Buying a U2415 doesn't imply cheaping out. Besides that, 16:10 displays are a rarity these days, and vertical pixels help if you're doing things like programming. Image quality will be better as well.
 
Eh, for programming I'd get one of those nice 32" 1440p monitors from Samsung/BenQ/HP. 24" 16:10 is nice, but those 32" panels come highly recommended.

P.S. Of course, for gaming you need a high refresh rate and G-Sync, no doubt about it.
 
Nvidia is going to have to cave in sooner or later to adaptive sync. There are way, way more freesync monitors available, it's cheaper, and it's almost embarrassing to see the market leader not supporting the more prolific option.

That's what I thought. When Adaptive Sync was ratified I called G-Sync dead. That didn't happen. Adaptive Sync didn't even seem to put price pressure on G-Sync.
 
That's what I thought. When Adaptive Sync was ratified I called G-Sync dead. That didn't happen. Adaptive Sync didn't even seem to put price pressure on G-Sync.

It just doesn't make sense at this point. There are 4x as many FreeSync monitors on the market, and they're all significantly cheaper.
 
That's what I thought. When Adaptive Sync was ratified I called G-Sync dead. That didn't happen. Adaptive Sync didn't even seem to put price pressure on G-Sync.

Here's why it isn't dead. NVIDIA has the lion's share of the gaming dGPU market and they seem to be increasingly working with the monitor/panel manufacturers to build interesting tech beyond just the G-Sync selling point. As long as NVIDIA makes sure that the "best" high-end monitors (refresh rate, resolution, panel types, etc.) are G-Sync monitors, and as long as NVIDIA can maintain its high share of the high-end of the dGPU market, G-Sync -- proprietary as it is -- will be the de facto standard, with the true "standard" adaptive sync tech being effectively a niche technology (since it doesn't work on the bulk of high-end GPUs sold today).
 
That's what I thought. When Adaptive Sync was ratified I called G-Sync dead. That didn't happen. Adaptive Sync didn't even seem to put price pressure on G-Sync.

Does NVidia even support adaptive sync in their drivers? I don't think so. Until they do, there is no incentive for NVidia owners to go with FreeSync displays, which is exactly what is required to put pressure on NVidia to lower the Gsync tax.

That said, from all the research I did prior to getting my first Gsync monitor, it seems that Gsync is superior to Adaptive Sync. But FreeSync2 is supposed to be a major improvement, so we'll see whether this provides the impetus to force NVidia to support it.
 
The only way I can see G-Sync (the currently superior tech) being supplanted is if Adaptive Sync becomes a mandatory part of the HDMI/DP specifications (no 4K output without full support, for example), along with new chips in monitors enabling pretty much what G-Sync does now. Right now it's mostly a patch job with limited features in comparison to G-Sync, or just a really poor frequency range.

For the record, I have AMD GPUs now and no Free/Adaptive/G-Sync monitors, as I wait for the technology to mature. Quite how much of the "tax" is added by the monitor manufacturers would be an interesting thing to find out.
 
Doesn't HDMI 2.1 have some kind of dynamic refresh rate requirement? If so then basically that means Nvidia won't support it.
 
Doesn't HDMI 2.1 have some kind of dynamic refresh rate requirement? If so then basically that means Nvidia won't support it.

Yes it does 😉:

  • Game Mode VRR features variable refresh rate, which enables a 3D graphics processor to display the image at the moment it is rendered for more fluid and better detailed gameplay, and for reducing or eliminating lag, stutter, and frame tearing.
More info from the HDMI cartel site directly:

 
So how do you folks think Nvidia will handle that situation, provided it is at least equal to Adaptive Sync?
  1. Delay implementing HDMI 2.1 for years to come to milk g-sync as much as possible
  2. Heavily promote g-sync premium functionality over VRR
  3. Reduce hardware/license premium for g-sync
  4. Combination of above 3
  5. We're Nvidia, our market share and dedicated users will still pay for g-sync irrespective
 