All Samsung's Ultra HD monitors in 2015 to support FreeSync

Status
Not open for further replies.

SoulWager

Member
Jan 23, 2013
155
0
71
I was waiting for MIMO powerline adapters for a long time. I never felt the need for the companies to prove that these powerline adapters were "faster" months before the release. There wasn't a single review until after the product was out.

I read the reviews and then decided if I wanted to buy the product.

Why consumers feel they are entitled to see a product far before it's released is beyond me. Besides, would people here seriously make a purchasing decision on FreeSync based on a review that came out today, when there isn't a product available, or even a launch date announced for a product with FreeSync? FreeSync could be the worst thing today, months before its launch in a retail product, and the best thing later. All a review does today is tell you about its performance today. It tells you nothing about the performance of the retail product available to us months down the line...
AMD said they'd start sampling displays in September and October. Is that not a good enough reason to expect a review by now?

AMD's whole PR strategy was trying to convince people these displays were coming out a lot earlier than it was reasonable to believe.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I hope this gets quick wide adoption so NVidia and Intel have to support it in the future. It would be a big win for consumers.

Because there are only 2 discrete players plus Intel, they may not have to support it. If there were more players, it would come to that; but if Nvidia has their solution, AMD has another, and Intel just ignores it like they often do, there is nothing to force adoption of a single standard.

While I'd rather there were one, this isn't a case where it is definite.
 
Feb 19, 2009
10,457
10
76
Some people: The glass is half-full.

Some other people: That's not even a glass. That's just a bunch of lies and marketing hype and why did they even say anything about it with over a year needed to get it ready for manufacturers to even put something out and there aren't even reviews or demos of it holding any water and what if you put something other than water in it and no I cannot wait for a significant step in an industry that was basically stagnant for years and is only now starting to add new and interesting features to glasses.

All that aside, at least people can stop calling it vaporware.

Heh, I recall when Freesync was announced, many called it vaporware, a distraction, a marketing stunt with no products.

It's the same as when Mantle was announced.

Since when is announcing an upcoming feature an act so deserving of hate and bile?
 

itsmydamnation

Platinum Member
Feb 6, 2011
2,802
3,250
136
People here should watch John Carmack's Oculus / Gear VR keynotes before jumping the gun on who lied about what, and what hardware right now can and can't do.

It is a far more complex matter than simply "no additional hardware required" vs. "needs new hardware, herder herder, liar!"

Don't you people get tired of the same old boring battle lines :eek:
 

garagisti

Senior member
Aug 7, 2007
592
7
81
I hope this gets quick wide adoption so NVidia and Intel have to support it in the future. It would be a big win for consumers.
Intel actually already does with the mobile processors / equivalents of APUs. It is used there to enable power savings. It shouldn't be too hard to implement the same in the desktop products. AMD is also on the same page with their APUs, iirc. That leaves Nvidia. If I know any better, they will push G-Sync for all they can milk out of it. Not a bad plan, really. If you read the forums here and elsewhere, you get the impression that they have a bunch of customers ready to fork their money over. I would be quite a daft CEO if I didn't take their money.

The above is not an attempt to say that their product doesn't work, or is bad. It is a solution. However, it is always bad to lock an industry down to proprietary standards, as it gives monopolistic powers to owners and has the effect of increasing costs for end customers. Take the case of HDMI. DisplayPort, for whatever it is worth, can do a lot more, but since HDMI is an established standard, it is hard to do away with. Then there's the matter of cost, and DP is much cheaper. IIRC, Intel owns patents related to HDMI.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
AMD said they'd start sampling displays in september and october. Is that not a good enough reason to expect a review by now?

AMD's whole PR strategy was trying to convince people these displays were coming out a lot earlier than it was reasonable to believe.
Ahem, so a manufacturer will make a display, and AMD will prepare drivers. Is it really unreasonable to think that it may take some time?
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,949
504
126
AMD's whole PR strategy was trying to convince people these displays were coming out a lot earlier than it was reasonable to believe.
According to whom? Is there a manual I can read that outlines an acceptable announce-to-market time frame? DirectX 12 was announced in March 2014, and we won't see it until the end of 2015 (maybe), and I don't see anyone up in arms about that.
 

rgallant

Golden Member
Apr 14, 2007
1,361
11
81
Ahem, so a manufacturer will make a display, and AMD will prepare drivers. Is it really unreasonable to think that it may take some time?
Just saying:

Scaler chips might take time to make. The chips are out but still need to be fitted into new or old panels?

LCD panels might need to be tweaked for color shift from the changing switching speed.

Or getting rid of old stock before it's too out of date.
Etc., etc.

But hey, these guys must have 3D printers to make an A-Sync monitor to review for you.
 

Rezist

Senior member
Jun 20, 2009
726
0
71
Just saying:

Scaler chips might take time to make. The chips are out but still need to be fitted into new or old panels?

LCD panels might need to be tweaked for color shift from the changing switching speed.

Or getting rid of old stock before it's too out of date.
Etc., etc.

But hey, these guys must have 3D printers to make an A-Sync monitor to review for you.

I think a lot of it is that they have to line up with the launch of a new series of monitors.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Nvidia is not stupid. I very much doubt they would risk being heavily invested in a particular technology when the same result can be achieved with already available tech: a VESA DP update, a firmware update, and/or a better scaler.
The idea of sending frames to the display at the rate at which they are available, and pacing the display refresh rate to follow GPU frame generation, is simple enough.
But every great idea is simple at its core, and every implementation of this idea becomes a real-world problem, and solving a real-world problem is never a simple task.

So far FreeSync in action has been seen by AMD family members and a few of their friends. Contrary to how G-Sync had been publicly presented and demoed. And heavily praised(!).

All this is what's leading me to that speculation of AMD having early problems, and G-Sync being a better, more advanced solution.

It's Xmas season; sales will be up for everyone.
Every G-Sync monitor sold is a customer locked down for the foreseeable future. And AMD doesn't even have an engineering sample or convincing demo to proudly show to the world.
Yet come March, Samsung is shipping.
Why not yell at the top of their lungs about a G-Sync killer... weird, no?



Saying FreeSync supports 9Hz is a bit convoluted.
It's the display/panel that can go down to 9Hz.
And even then, what good is 9fps, not to mention... khm... 30'' at 9Hz(!) LOL?
I doubt there is any need for this, except for marketing purposes.
But hey, why not go for it. It certainly doesn't hurt: 9Hz - 240Hz

I've read numerous articles, watched interviews, etc., to come to the conclusions I have. TBH it would take a lot of time and effort to find all of the info. I'll leave you to your opinion and simply bookmark this for future reference when FreeSync demos are available. I think you'll find that you are underestimating FreeSync.

As far as why nVidia did it the way they did, how about so they could lock it to their hardware? That's all the reason nVidia would need.

Until then, cheers. :cool:
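The pacing idea in the quoted post, refreshing the display when a frame is ready rather than on a fixed clock, can be sketched with a toy latency model. This is a minimal sketch, not a description of how G-Sync or FreeSync actually works; the function names and timings are illustrative assumptions:

```python
# Toy comparison of fixed vs. variable refresh: how long a finished frame
# waits before the display can show it. All numbers are illustrative only.

def display_latency_fixed(frame_done_ms, refresh_hz=60):
    """Fixed refresh: a finished frame waits for the next scanout tick."""
    period = 1000.0 / refresh_hz
    ticks_elapsed = int(frame_done_ms // period)
    next_tick = (ticks_elapsed + 1) * period
    return next_tick - frame_done_ms  # ms the frame sits waiting

def display_latency_variable(frame_done_ms):
    """Adaptive refresh (simplified): scanout starts when the frame is
    ready, so the wait is zero within the panel's supported Hz range."""
    return 0.0

# A frame finishing at t = 20 ms misses the 16.67 ms tick of a 60 Hz panel
# and waits ~13.3 ms; a variable-refresh panel would show it immediately.
print(display_latency_fixed(20.0))     # ~13.33
print(display_latency_variable(20.0))  # 0.0
```

The toy model also hints at why the quoted post calls the idea "simple at its core": the arithmetic is trivial, while the real-world problem is making panels and scalers tolerate an irregular refresh interval.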
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Right, it's called marketing. AMD paid Samsung to have it in monitors. In a year, that agreement will end and it's back to normal. Standard-issue stuff.

Wut? How do I get in on this AMD gravy train? Paying devs, paying IHVs, I suppose paying Khronos to make it a standard? Surely they must have paid Apple to use their GPUs, Sony and Msft to use their APUs, Adobe to implement OpenCL instead of just sticking with CUDA, etc...

Come on, at least if you are going to make claims, have something/anything to support them.

I don't see how g-sync goes obsolete. It's not like we can compare performance or price yet. Even if AMD is going to be competitive, it's not like someone with a g-sync compatible video card will be buying an adaptive-sync display.

If you're ONLY buying a monitor, and already have one of the variable refresh compatible AMD video cards, wait for these to be released.

If you're buying both a video card and a monitor, maybe wait, maybe get something now, depends on budget and what you need it for.

Otherwise, this doesn't really change any purchase decisions.

The world isn't just red and green, you know. There's blue as well, and if blue wants to have variable refresh, and why wouldn't they, there's a free and open standard ready for them. I think you're going to find it won't be AMD vs. nVidia but nVidia vs. the world.

So why is Samsung announcing it will only be in their 2015 4K monitors? Why not all of them?

The scalers are pretty new. A likely reason would be not enough for everyone to have them in all models without demand adversely affecting the price.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
AMD said they'd start sampling displays in september and october. Is that not a good enough reason to expect a review by now?

AMD's whole PR strategy was trying to convince people these displays were coming out a lot earlier than it was reasonable to believe.

nVidia said Gsync monitors would be available in Q2 this year. It was halfway through Q3 before the first one shipped. You didn't see a single Gsync thread devolve into berating nVidia over it. The worst part about it is that the people doing all of the complaining are nVidia users who can't use it anyway.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Heh, I recall when Freesync was announced, many called it vaporware, a distraction, a marketing stunt with no products.

It's the same as when Mantle was announced.

Since when is announcing an upcoming feature an act so deserving of hate and bile?

- FreeSync slowly emerging as an adaptive refresh tech that aims to become industry standard,

- Mantle proving that reduction in draw calls causes a significant increase in performance in CPU limited scenarios,

- High end GPUs from one vendor rumoured to be going WC for superior cooling and noise levels.

All of this tech is being pushed by AMD, and in all of the threads related to any of these topics, the same people who own products from another popular PC vendor always have negative things to say or purposely downplay these new advancements.

You connect the dots.

Personally, I will support open standards if the option is there. That means we still have the power to buy FreeSync monitors and force NV to support it over time. Then I don't have to worry about which GPU vendor's card to buy over the next 7-10 years that I own that monitor (my current monitor is already 7.5 years old). However, if we start buying G-Sync monitors, we are not going to get NV to change their stance. The end result is that we will be vendor-locked with either AMD and FreeSync or NV and G-Sync for the entire lifetime of our LCD monitor.

Promoting vendor lock-in over an open standard is so against my values that I will not buy a G-Sync monitor, because I don't want tech to be in the way of my buying choices, and I will send that message by voting with my wallet. If NV changes its stance and offers support for both G-Sync and FreeSync, even if they think G-Sync is superior, I have no issue with that.

Since FreeSync/GSync benefit the most at < 60 fps, and since 4K IPS monitors are still not DP1.3 + HDMI 2.0 capable at a reasonable price, it doesn't matter that FreeSync will launch in 1st or 2nd half of 2015. What matters is that in the next 3-5 years there will be a major push into 4K PC gaming and AMD's FreeSync must work well in time for when the 4K IPS/VA adoption skyrockets. Until the inevitable 4K revolution takes place, saying that AMD is late misses the point of where we are in the 4K adoption product cycle.

The fact that GSync beat it to market is good for NV, but for most consumers it changed little, since so few GSync monitors are out and only 1-2 of them (the Swift) are actually any good. For new tech to become mainstream, we need hundreds of monitors to support it. Thus far GSync beat FreeSync to launch, but it has failed to gain any mainstream traction. My feeling is that the general market opposes being locked into a GPU upgrade path, most are now waiting to jump to 4K, and most know that 99% of 1080p/1440p TN GSync monitors are not very good as monitors to begin with.
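On the Mantle draw-call bullet earlier in this post, the arithmetic behind "fewer draw calls means more fps when CPU-limited" can be sketched with a toy model. The 5 µs per-call overhead and 4 ms of fixed per-frame work are made-up illustrative numbers, not measured API costs:

```python
# Toy CPU-side frame cost: fixed per-frame work plus per-draw-call
# submission overhead. Illustrates why batching, or a lower-overhead API,
# helps in CPU-limited scenes. All constants are illustrative assumptions.

def cpu_frame_time_ms(draw_calls, overhead_us_per_call=5.0, other_work_ms=4.0):
    """Total CPU time per frame in milliseconds."""
    return other_work_ms + draw_calls * overhead_us_per_call / 1000.0

# 10,000 draw calls: 4 + 50 = 54 ms/frame (~18.5 fps, CPU-bound).
# The same scene submitted as 2,000 calls: 4 + 10 = 14 ms (~71 fps).
print(cpu_frame_time_ms(10_000))  # 54.0
print(cpu_frame_time_ms(2_000))   # 14.0
```

With these assumed numbers, the per-call overhead dominates the frame, which is exactly the regime where reducing draw-call cost produces the large gains AMD demonstrated.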
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
nVidia said Gsync monitors would be available in Q2 this year. It was halfway through Q3 before the first one shipped. You didn't see a single Gsync thread devolve into berating nVidia over it. The worst part about it is that the people doing all of the complaining are nVidia users who can't use it anyway.

Exactly. The ONLY decent GSync monitor is the Swift, and that one is plagued by a ridiculously high price (a 32" 4K IPS BenQ is $999!), it has major quality control issues (gamers report being on their 2nd-4th unit due to overheating or dead pixels, and color accuracy/color shift issues after some months of ownership, as confirmed by European reviewers and some gamers), and its supply is very problematic. Thus, despite GSync being great on paper, in practice as a mainstream solution for gamers it really isn't taking off, and the options/adoption by LCD monitor manufacturers are terrible.

Samsung is one of the major sellers of LCDs. If AMD gets Dell, Acer, ASUS and LG onboard for FreeSync, then GSync will suddenly be way behind in adoption. Considering that ALL of Samsung's 4K monitors starting in 2015 will support FreeSync, that means it's just a matter of time before other major manufacturers jump onboard as they will be unlikely to allow only Samsung to have this competitive feature advantage.

Finally, if GSync doesn't make it to high end 4K-5K IPS/VA panels long-term, it will face major adoption issues for premium PC gaming.
 

Atreidin

Senior member
Mar 31, 2011
464
27
86
nVidia said Gsync monitors would be available in Q2 this year. It was halfway through Q3 before the first one shipped. You didn't see a single Gsync thread devolve into berating nVidia over it. The worst part about it is that the people doing all of the complaining are nVidia users who can't use it anyway.

I'm sure someone could enlighten us as to how Gsync delays are AMD's fault. Nvidia fanatics are great at making everything AMD's fault. They got so good at spinning it that Ubisoft thought they could try that strategy out too.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
I'm sure someone could enlighten us as to how Gsync delays are AMD's fault. Nvidia fanatics are great at making everything AMD's fault. They got so good at spinning it that Ubisoft thought they could try that strategy out too.

The converse is also true. Isn't it?
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
Is this Samsung's attempt to get back into the gaming monitor market? They used to make 120+Hz monitors, but those have all been discontinued and now they only have 60Hz monitors available. Since these variable refresh rate monitors are aimed at gaming (or at least that is the aspect most of us here care about), I hope they decide to come out with panels that go above 60Hz. G-Sync is in displays from manufacturers who already had "gaming monitors" in the field (ASUS, BenQ, Acer); hopefully Samsung unleashes their awesome displays for gamers.
 

Paul98

Diamond Member
Jan 31, 2010
3,732
199
106
What I expect we will see at first is 4K at 60Hz max, while lower resolutions would see 144Hz or maybe higher.
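That expectation lines up with rough DisplayPort bandwidth arithmetic. A quick check, assuming DP 1.2's effective payload of about 17.28 Gbit/s (four HBR2 lanes after 8b/10b coding) and a ballpark ~20% blanking overhead:

```python
# Rough link-bandwidth check for display modes over DP 1.2.
# The 1.2x blanking factor is an assumed ballpark, not an exact timing.

def link_gbps(width, height, hz, bpp=24, blanking=1.2):
    """Approximate link bandwidth a mode needs, in Gbit/s."""
    return width * height * hz * bpp * blanking / 1e9

DP12_GBPS = 17.28  # effective payload: 4 lanes x 5.4 Gbit/s x 8b/10b

for w, h, hz in [(3840, 2160, 60), (2560, 1440, 144), (3840, 2160, 144)]:
    need = link_gbps(w, h, hz)
    verdict = "fits" if need <= DP12_GBPS else "exceeds"
    print(f"{w}x{h} @ {hz}Hz: {need:.1f} Gbit/s -> {verdict} DP 1.2")
```

Under these assumptions, 4K tops out around 60Hz on DP 1.2 while 1440p has headroom for 144Hz, which is why higher 4K refresh rates would have to wait for a faster link such as DP 1.3.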
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
Source? I haven't seen a word about paying anyone and I assume they didn't.

That is how it works. Same with gaming conventions: Nvidia pays them to put up the logo and use Nvidia cards in the computers. It's all marketing to get brand recognition.
 