
Nvidia and Freesync... never?

amenx

Diamond Member
What dumb excuse or reason could Nvidia have for not using freesync? Are they afraid to acknowledge that the competition's free alternative to their own proprietary solution may in fact make more sense? For both consumers and manufacturers? Some loss of face may be involved, but they should swallow their pride, accept it, and just allow their GPUs to be compatible with freesync. In fact, it would make them look better and more forward-thinking. So wth? Is there any reasonable rationale involving tech or business strategy? Or can it be just pure, dumb pettiness?
 
It's pure business. Just one more value-added feature for Nvidia (Gsync). When you control almost 80% of the dGPU market share, you do what you want. Until market forces change things around, there is no good business reason for NV to support freesync.
 
I thought the new upcoming HDMI spec was going to include freesync, so then all GPUs supporting the new HDMI spec would also support freesync. Or is that not the case?
 
It's pure business. Just one more value-added feature for Nvidia (Gsync). When you control almost 80% of the dGPU market share, you do what you want. Until market forces change things around, there is no good business reason for NV to support freesync.
Sure, but there are more freesync than gsync displays on the market. And the way the trend is going, it could backfire on Nvidia. Exclusively sticking to your own more expensive solution with a smaller compatibility list doesn't sound like good business sense to me. I'm all for Gsync as a proprietary tech, but why punish your customers by not also making their cards work with freesync displays? That's my gripe. The display/cost, to me, is just as important as which GPU I choose, whether NV or AMD. If I could save a couple hundred $ on the display due to freesync vs gsync, Nvidia may lose me as a customer.
 
If AMD manages to command marketshare and mindshare for a couple straight years, then maybe we'll see G-Sync open up a bit more, but it's unlikely.

HDMI 2.1 includes a variable refresh protocol (called Game Mode VRR), which AMD says they will support with FreeSync, but it will be voluntary for TV and monitor makers to support it, just like FreeSync. Of course, with Xbone X supporting HDMI 2.1 and FreeSync2, this will help FreeSync cement itself as the "home theater" standard for VRR unless NVIDIA somehow gets G-Sync modules into TVs.

Game Mode VRR

Q: Does this require the new HDMI cable?

A: No

Q: Will this work with 8K@60 or 4K@120Hz?

A: Yes, if those features are implemented along with the higher video resolutions. That will require the new 48G cable.

Q: Is this primarily for consoles or will PCs utilize this also?

A: It can be used for both.

Q: Will this result in more gaming PCs connecting to HDMI displays, either monitors or TVs?

A: The intent of the feature is to enable HDMI technology to be used in these applications. Given that HDMI connectivity already has a strong presence in this area, we expect that use of HDMI technology in gaming will continue to grow.
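As a rough sanity check on the 48G cable answer above, here's a back-of-the-envelope sketch of the raw pixel data rates involved, assuming uncompressed 8-bit RGB and ignoring blanking/encoding overhead (which adds a substantial margin in practice):

```python
# Back-of-the-envelope uncompressed video data rate.
# Assumptions: 24 bits per pixel (8-bit RGB), no blanking overhead,
# no Display Stream Compression.
def data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate in gigabits per second."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(f"4K@120: {data_rate_gbps(3840, 2160, 120):.1f} Gbps raw")
print(f"8K@60:  {data_rate_gbps(7680, 4320, 60):.1f} Gbps raw")
# 4K@120 fits with room to spare; 8K@60 is already near the 48 Gbps
# limit before any overhead, which is why the new cable (and, in
# practice, compression) is needed at those modes.
```

With these assumptions, 4K@120 comes out to roughly 24 Gbps raw and 8K@60 to roughly 48 Gbps raw, which lines up with why those modes need the new 48G cable while plain VRR does not.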
 
Sure, but there are more freesync than gsync displays on the market. And the way the trend is going, it could backfire on Nvidia. Exclusively sticking to your own more expensive solution with a smaller compatibility list doesn't sound like good business sense to me. I'm all for Gsync as a proprietary tech, but why punish your customers by not also making their cards work with freesync displays? That's my gripe. The display/cost, to me, is just as important as which GPU I choose, whether NV or AMD. If I could save a couple hundred $ on the display due to freesync vs gsync, Nvidia may lose me as a customer.


I don't really care about display cost since I usually keep it for 3 to 4 years. $200-$300 additional dollars over that time range is insignificant to me. Heck, if I upgraded a monitor every other year it still wouldn't matter to me.
 
I would say it's strategy. Freesync/adaptive sync is an industry standard, like DLNA, which means manufacturers can implement it without paying Nvidia to license G-Sync. G-Sync creates an additional revenue stream for Nvidia. Apple does something similar with their home media stuff: it uses a proprietary networking protocol, which locks everyone else out unless they pay Apple a licensing fee. It's good business, I guess. Both G-Sync and Apple's networking protocol are probably built on top of existing protocols: adaptive sync for the former and TCP/IP for the latter. There are only so many ways you can skin a cat.
 
What dumb excuse or reason could Nvidia have for not using freesync? Are they afraid to acknowledge that the competition's free alternative to their own proprietary solution may in fact make more sense? For both consumers and manufacturers? Some loss of face may be involved, but they should swallow their pride, accept it, and just allow their GPUs to be compatible with freesync. In fact, it would make them look better and more forward-thinking. So wth? Is there any reasonable rationale involving tech or business strategy? Or can it be just pure, dumb pettiness?

FreeSync is trademarked by AMD, and competitors are not in the business of effectively advertising the "other" team. Adaptive Sync, on the other hand, is a vendor-neutral industry standard (FreeSync is effectively AMD's name for it), so Nvidia would be pretty stupid to advertise/use FreeSync when Adaptive Sync doesn't link them to AMD's marketing team.

FreeSync and GSync can both get tossed in the garbage if Adaptive Sync can surpass them without the marketing/pricing issues with the former two.
 
For one, Freesync doesn't equal Gsync. With Gsync NVIDIA can ensure that their standards are kept, vs Freesync which is all over the place. It's probably a matter of control.
NVIDIA can exert just as much control if they moved to an implementation of adaptive sync. The only reason Freesync is "all over the place" is because AMD allows it to be, which is a good thing in my opinion. If AMD demanded higher standards, we lose budget FS monitors entirely. With the way they do it, we get both high-end and cheaper options.

NVIDIA would have their own branding and their own standards because they wouldn't be adapting Freesync, they'd be designing their own implementation of displayport's adaptive sync. The difference is that this hypothetical monitor would work with AMD GPUs as well, through Freesync. The only reason they keep G-Sync is to lock customers into their ecosystem and sell monitors for higher prices. Nothing about it benefits any of us.
 
FreeSync is trademarked by AMD, and competitors are not in the business of effectively advertising the "other" team. Adaptive Sync, on the other hand, is a vendor-neutral industry standard (FreeSync is effectively AMD's name for it), so Nvidia would be pretty stupid to advertise/use FreeSync when Adaptive Sync doesn't link them to AMD's marketing team.

FreeSync and GSync can both get tossed in the garbage if Adaptive Sync can surpass them without the marketing/pricing issues with the former two.
Thanks for clearing that up. It's essentially what I meant: the open tech behind freesync, not necessarily freesync by name.
 
Supposedly there's a new HDMI standard coming which includes something similar to gsync/freesync. Until then we're vendor locked.
 
Nvidia does it for 2 reasons:
1) To make money from licensing. This is the reason for a business's existence, after all.
2) To enforce a high quality standard, which is required for Nvidia to maintain their premium brand. So sure, there might be a lot of freesync displays out there, but most of them don't come up to the minimum required spec of every gsync display ever released.
 
"I don't care about cost!" "Money is insignificant to me!"... I can see why you're an NVidia customer. 😛

That's not true. Context matters here. Spending an extra $200-$300 over a 3-year period is nothing. If it is, a person shouldn't be buying PC gear.
 
That's not true. Context matters here. Spending an extra $200-$300 over a 3-year period is nothing. If it is, a person shouldn't be buying PC gear.

However, I'm a Nvidia customer because they are the only graphics company that gives me the high end performance I want.
 
However, I'm a Nvidia customer because they are the only graphics company that gives me the high end performance I want.
Fair enough. But you mean "for gaming", right? Because Vega56 and Vega64 are one heck of a content-creation card, as I understand it, even if it is a little underwhelming for current games.
 
That's not true. Context matters here. Spending an extra $200-$300 over a 3-year period is nothing. If it is, a person shouldn't be buying PC gear.

Well that's kind of a crappy attitude to have toward people who can't drop $2-3k every couple of years on a new setup. Nothing wrong with only dropping a few hundred every couple of years to keep your system running the newer games.

For those guys, dropping an extra $300 for Gsync is out of the question.
 
Fair enough. But you mean "for gaming", right? Because Vega56 and Vega64 are one heck of a content-creation card, as I understand it, even if it is a little underwhelming for current games.

I'm a gamer, so I'm talking about gaming. I like as high an FPS as possible with all the bells and whistles turned on. I know others don't mind lesser settings, but my comments are meant for folks who game at the high end. If a person can afford a $500-$650 video card, I believe spending the extra $200/$300 on a monitor is a no-brainer.
 
Well that's kind of a crappy attitude to have toward people who can't drop $2-3k every couple of years on a new setup. Nothing wrong with only dropping a few hundred every couple of years to keep your system running the newer games.

For those guys, dropping an extra $300 for Gsync is out of the question.

Check my post above. Also, I'm not talking about buying a completely new setup every couple of years. As a matter of fact, I mentioned I like to keep my monitors 3 to 4 years. That's why I said the TCO becomes very affordable when you buy a Gsync monitor (figuring a four-year span):

$900 Gsync = $225 per year = $18.75 per month

$600 Freesync = $150 per year = $12.50 per month
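The per-month figures above divide the purchase price over four years of ownership; a quick sketch of the same TCO arithmetic (prices are just the example numbers from the post, not market data):

```python
# Monitor total-cost-of-ownership spread over an ownership period.
# Prices are the illustrative figures from the post, not market data.
def tco_breakdown(price, years):
    """Return (cost per year, cost per month) for a given purchase price."""
    per_year = price / years
    per_month = per_year / 12
    return per_year, per_month

for label, price in [("Gsync", 900), ("Freesync", 600)]:
    per_year, per_month = tco_breakdown(price, years=4)
    print(f"${price} {label} = ${per_year:.2f} per year = ${per_month:.2f} per month")
```

Running it reproduces the $225/$18.75 and $150/$12.50 splits quoted above; stretch `years` to match however long you actually keep a monitor.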
 
That's not true. Context matters here. Spending an extra $200-$300 over a 3-year period is nothing. If it is, a person shouldn't be buying PC gear.

It might not be $200-300 more every 3yrs though.

1) What happens when a fancy new monitor comes out that is only supported by Freesync (e.g., Samsung CHG90, Acer XR382CQK, LG 38UC99-W, some OLED variant, etc.)? Now you either have to wait for someone to come out with a G-Sync version or sell your Nvidia gear and buy the new monitor.

or

2) Some uber G-Sync monitor comes out a year after you bought your current monitor and you have to pony up the cash again not only for the monitor but for the G-Sync tax.

In the past, monitor progress was pretty slow and it wasn't worth upgrading often. These days we're getting all sorts of new options and tech. That's what sucks about the current situation with Nvidia not supporting Adaptive Sync and with AMD not bringing much to the GPU fight.
 
Well that's kind of a crappy attitude to have toward people who can't drop $2-3k every couple of years on a new setup. Nothing wrong with only dropping a few hundred every couple of years to keep your system running the newer games.

For those guys, dropping an extra $300 for Gsync is out of the question.

The $200 premium for a G-Sync display is less than 20 cents a day if you keep it for three years. If you are like me and keep displays for a decade, then it's only about 5 cents a day.

Pocket change.

Considering the alternative is using an AMD video card, you could probably make it up on the power bill.
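Whether the power bill actually makes up a $200 premium depends entirely on the numbers you plug in; here's a break-even sketch where the wattage gap, play time, and electricity rate are all illustrative assumptions, not measured figures:

```python
# Break-even estimate: how long until extra GPU power draw costs $200?
# Every input below is an assumption for illustration only.
extra_watts = 100        # assumed extra draw of the hungrier card under load
hours_per_day = 3        # assumed daily gaming time
rate_per_kwh = 0.12      # assumed electricity price, $/kWh

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * rate_per_kwh
years_to_break_even = 200 / extra_cost_per_year
print(f"~${extra_cost_per_year:.2f} extra per year; "
      f"$200 takes ~{years_to_break_even:.1f} years to recoup")
```

With these particular assumptions the gap is only around $13 a year, so the premium would take well over a decade to recoup; heavier use or pricier electricity shortens that considerably.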
 
The $200 premium for a G-Sync display is less than 20 cents a day if you keep it for three years. If you are like me and keep displays for a decade, then it's only about 5 cents a day.

Pocket change.

Considering the alternative is using an AMD video card, you could probably make it up on the power bill.

Yeah. And there's the simple fact that if you go AMD, you are going with a supplier that is less reliable in delivering new cards compared to NVIDIA.

The GTX 1080 launched in May of 2016. It wasn't until August 2017 that AMD FreeSync users had the honor of being able to buy a roughly equivalent card to the 1080, but for more money. Oh, and it guzzles more power, too.
 
Yeah. And there's the simple fact that if you go AMD, you are going with a supplier that is less reliable in delivering new cards compared to NVIDIA.

The GTX 1080 launched in May of 2016. It wasn't until August 2017 that AMD FreeSync users had the honor of being able to buy a roughly equivalent card to the 1080, but for more money. Oh, and it guzzles more power, too.

Honestly, that's why I had no problem picking up a Gsync monitor six months ago. I realized that despite using both AMD and NVIDIA cards for over a decade, I would be on NVIDIA for at least the next 5 years, so it was a non-issue. Before that I had the mindset that I "didn't want to get locked into one ecosystem," but then realized that since AMD will fail to deliver for at least the next three years, I'll be on NVIDIA for the next five.
 