monstercameron
Feb 12, 2013
snip
OT, did you expect anyone to read that wot?
OT, I just don't get the fuss?
snip
but every one of the pro Gsync peeps in these threads has spent the $250-2k extra to be running nv high end cards, so buying a $800 TN Gsync monitor shouldn't be an issue for every anti AS poster here.

The obvious one: $$$
You are all very confused about my post based upon your apparent hatred of all things amd.

Let's see how strong adoption is among folks before leaping to conclusions about its place in the market.

No hatred. Just realism.
Nvidia will eventually be supporting a VESA standard (adaptive sync), not an AMD driver (freesync), that you are all apparently confused about.

Will Nvidia be supporting that? What makes you so sure? It's optional: if they are connected to an AS monitor they can ignore it completely if they want to.
My comment about the quality and resolution still stands. Why buy a 1440p gsync monitor when 4k is coming?

Because gaming on 4k requires way too heavy GPUs? Because 4k 144Hz panels don't exist? Because there isn't even an interface standard that can carry 4k 144Hz over a single DP cable? Because 2560x1440 is the sweet spot right now?
There will be 4k versions of gsync and adaptive sync. Buying 1440p now would be stupid when most users keep monitors for much longer than their other hardware.

What you're saying is: those sold-out Swift monitors in Europe were all bought by idiots who don't know any better. But maybe they had the smarts, lacking in others, to see that 4K is just too heavy, with refresh rates too low, for current GPUs? Could that be it?
Now finally on to Intel and arm. Why would they enter into a cross licensing agreement with nvidia, setting themselves up for potential lawsuits later on if the deal goes south, when they could just use the open VESA standard? Adaptive Sync's potential to reduce power usage is why I believe Intel and arm chip makers will support it.

Nvidia themselves have said in the past that they're not interested in licensing, so I don't get why you're bringing that up.
It's pretty obvious that chip builders don't care about desktop at all, let alone their power usage.

VESA has an optional standard that could let an ultrabook or tablet gain extra battery life, and you don't think Intel and ARM manufacturers are interested?

I'm really not sure why I'm trying to use rational thought in VC&G when FUD, slander, marketing, and accusations of wrongdoing are all half the people in here talk about.
the ignore list is super useful, half the threads are just blanked out for me. what I don't get is the vitriol against freesync and towards AMD.

Just how much do you guys think AMD has control over? If AMD says freesync is free, does that translate to the display manufacturers also saying it's free? Think about it... that's why they don't want their brands shown in demos.
It's pretty obvious that chip builders don't care about desktop at all, let alone their power usage.

Would you care to explain? I have no idea what you're trying to say.
VESA has an optional standard that could let an ultrabook or tablet gain extra battery life, and you don't think Intel and ARM manufacturers are interested?

How did Ultrabooks suddenly become a topic in a Freesync discussion?
Have you ever heard of Bulldozer?
The reason for the vitriol is that AMD has been proven to have been lying, repeatedly, from the very start, about just about everything relating to FreeSync. Why shouldn't we react negatively to that? Why aren't you reacting negatively to that? What I don't get is the unflinching loyalty to AMD even after they lie to your face.

Please post proof of them lying about freesync.
What's very different is variable refresh, where you're changing the frame interval dynamically based on how long it takes to render a frame - effectively not having a refresh rate at all. So far, that hasn't been implemented by anyone but Nvidia. It's a fundamentally superior way of displaying video, so I don't doubt that it will become universal eventually, but there hasn't been any indication that Intel or ARM has been moving toward it with any rapidity.
Using DisplayPort Adaptive-Sync, the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display.
The minimum and maximum times between the display of new frames (the vblank period) are exposed to the GPU via DisplayPort Adaptive-Sync. Because the minimum/maximum vblank period is known to the graphics card, successive frames will intelligently be sent within those boundaries. Predictive or speculative timing is not required under this model, and the GPU will adjust the display's refresh rate to match the current frame rate.
If an upcoming frame is delivered outside of the monitor's supported vblank period, that frame will be immediately presented on-screen when available to ensure the fastest possible screen update.
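The mechanism described in those quoted passages can be sketched roughly as follows. This is a hypothetical Python sketch of the pacing logic, not actual driver code; the 30-144 Hz range and the function name are assumptions for illustration.

```python
def present_time(render_done, last_present, min_hz=30, max_hz=144):
    """Decide when to send the next frame, given the display's
    reported refresh range (hypothetical sketch, assumed 30-144 Hz)."""
    min_interval = 1.0 / max_hz   # panel can't accept frames faster than max_hz
    max_interval = 1.0 / min_hz   # panel must be refreshed at least every 1/min_hz s

    elapsed = render_done - last_present
    if elapsed < min_interval:
        # Frame finished too quickly: hold it until the panel can accept it.
        return last_present + min_interval
    if elapsed > max_interval:
        # Frame arrived outside the supported vblank period: present it
        # immediately for the fastest possible screen update.
        return render_done
    # Within the supported range: present as soon as the frame is ready,
    # so the refresh rate tracks the frame rate exactly.
    return render_done
```

The key point the quote makes is the middle case: inside the reported window, no prediction is needed, because the presentation time simply follows the render time.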
Use a forum search. I've been in just about all the FreeSync discussions since the beginning, I've seen all the arguments, and I did not make these up. These are things that people are convinced are true, but aren't.
Please post proof of them lying about freesync.
He's not going to prove anything, just dance around it. All he has to do is say they are lying over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over again, and then his goal of burning the phrase "AMD is lying" into everybody's heads will be complete.
No. They are mostly exaggerations. You might have had an individual make incorrect claims. That sort of thing happens. You are definitely trying to make claims simply so you can say it isn't true. AMD never said anything like you are claiming.
A-Sync will be in every DP1.2a/DP1.3 compliant monitor
A-Sync will not add to monitor production cost, as it will be part of the normal upgrades that display manufacturers acquire from ASIC manufacturers
A-Sync will not cause monitor manufacturers to add a price premium
A-Sync will work with every GPU
For example, someone saying A-Sync is part of the DP1.2a standard and asking "why would a company not include it?" doesn't translate into "it will be in every DP1.2a/DP1.3 monitor." Etc...
Nobody knows how AMD's proposition works because they haven't actually shown it demonstrating the important parts of the tech. The easiest thing to screw up would be the fallback cases for min and max refresh rate. At min refresh rate, you want to use triple buffering so you don't lose performance whenever a frame finishes during a (repeated) scanout. At max refresh, you want to use double buffering, or maybe even disable v-sync in order to maintain the low latency you get in cpu limited circumstances. Nvidia sidesteps this problem with panel self refresh.
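The fallback cases described above can be summed up as a simple policy: pick a buffering mode from the current frame rate relative to the panel's variable-refresh window. This is a hypothetical sketch of that argument, not any vendor's implementation; the function name, thresholds, and mode labels are all assumptions.

```python
def buffering_mode(fps, panel_min_hz=40, panel_max_hz=144):
    """Choose a buffering strategy for a variable-refresh panel
    (hypothetical policy sketch, assumed 40-144 Hz window)."""
    if fps < panel_min_hz:
        # Below the window the panel must repeat scanouts; triple buffering
        # lets rendering continue during a repeated scanout instead of
        # stalling and losing performance.
        return "triple-buffer"
    if fps > panel_max_hz:
        # Above the window, double buffering (or v-sync off) keeps latency
        # low in CPU-limited scenes, at the cost of possible tearing.
        return "double-buffer"
    # Inside the window the refresh rate simply follows the frame rate.
    return "variable-refresh"
```

Getting these two edge cases wrong is exactly the "easiest thing to screw up" the post refers to; panel self refresh sidesteps the low-fps case entirely.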
Upon connecting a FreeSync-enabled monitor to a compatible AMD Radeon™ graphics card, the minimum and maximum duration between the display of new frames (the vblank period) is exposed to the GPU via DisplayPort Adaptive-Sync. Because the minimum/maximum vblank period is known to the graphics card, successive frames will intelligently be sent within those boundaries.

That is totally and completely not the gsync approach.
I didn't say AMD claimed all of those things (they have claimed some), I said that the forum defenders of AMD claim them. And debunking the forum-generated myths is just as important.
Use the search function. Lots of other threads on FreeSync.
And you can use the search function too. The point is not invalidated just because I don't feel the need to repost the same thing every time a newcomer shows up and doesn't understand. It's all there in the threads, go read it yourself.
yet people are for real claiming that the writing is on the wall for gsync.
This is totally and completely not the gsync approach.
There are people claiming intel and arm are all gonna go freesync? Seriously? What exactly are they gonna use? The new spec? Do people know what the new spec even is? It's an extension of the eDP spec for use on desktop monitors. Mobile devices already had the capability!!! It is absolutely meaningless in the context of gsync. They are not making any sense whatsoever.

The market for Gsync is not affected by the eDP capability to lower refresh rates. This is a silly attempt to muddy up and confuse everything.

Nvidia has a specific market for Gsync. It is the desktop PC gamer. This market is dominated by Nvidia currently. Freesync is nowhere near making gsync irrelevant. And the new change in the desktop DP spec will do absolutely nothing to Gsync by itself. The eDP spec has been there all this time and no one was using it to get a Gsync-like experience. No one.

Why the heck would moving the eDP spec over to desktop DP all of a sudden have this massive impact? The new spec simply gives desktop the ability to do something that eDP was able to do all along. And no one used it to make a gsync experience. And no one is planning to try to use it on the desktop like that except AMD.

The standard change is not freesync. As a matter of fact, a lot more work has to be done to turn that standard into a gsync-like experience. The eDP standard has been out all along and AMD has not released freesync on their mobile HW. This is because there is a lot more to it. The "new" ability of this spec existed in eDP for years and AMD has yet to turn it into "freesync". When they finally do, no one knows how good or not it may be. It's incredible how much twisting and manipulating is being done.
He's not going to prove anything, just dance around it. All he has to do is say they are lying over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over again, and then his goal of burning the phrase "AMD is lying" into everybody's heads will be complete.
There's merit to it. Huddy, intentionally or unintentionally, has thrown out many mistruths, and that is a fact. AMD stated from the start that FS would require no additional hardware. Yet here I am reading that FS requires specific GPUs and monitors, with additional hardware added in; specifically, a new controller board that supports the feature on monitors.

Huddy stated that G-sync adds a frame of latency. Come on, NV isn't stupid: they wouldn't do this to a gaming panel because it would kill the very idea of a "gaming" monitor. A blatant lie. It doesn't add a frame of latency; it is a look-aside buffer.
Anyway, FS as an alternative is good. The problem is, right now, G-sync has ULMB and 144 Hz, and G-sync can be used on any resolution panel out there. If I'm not mistaken there is a 4K G-sync panel in the works by Acer.
I do not believe AMD can do anything to emulate ULMB or Lightboost. Lightboost is patented by NV, and while it was hacked to work in 2D mode (even for Radeon users), this technology was created by NV and it works very, very well. It's included with the G-sync ASIC (or module) built in. I have not heard of AMD having anything comparable to low motion blur modes, and this is a staple expected of gaming panels. So while AMD might have the low framerate thing covered (and we have NO IDEA if it will be as good as G-sync, and won't know until 2015), it doesn't have any comparable equivalent to Lightboost or ULMB. And probably won't, since both of these were created by NV.

I find it somewhat laughable that some are claiming FS will be better. Could it? Maybe. Perhaps. But let's be real: questionable, unless you consider "working on AMD" to make it better, and there's the whole Lightboost/ULMB thing too. And the claims out of nowhere that Intel will be using FS. Uh-huh. Okay. Prove it? The fact of the matter is, low framerate areas are the only thing covered and we have no idea how well it will do that. AMD does not and probably will not have anything comparable to Lightboost or ULMB, and as mentioned these are patented techs by NV.
