Was referring to the irony in the name FreeSync if they had to charge for it in any way.
Ah. I don't know if that is true or not. It probably is true. The name FreeSync around here seems to mean free of charge, but it might have been named that because it frees the display from a fixed refresh rate. I had noticed the possible double meaning from the start, but wasn't sure if it was intended by AMD or not (I'm not sure they've said why they named it as such).
I'd bet the person who came up with it saw it as a double meaning and thought it was clever.
I guess you could look at it both ways, but I'm guessing it's based on the first, not the latter.
FreeSync
- Frees the display from a fixed refresh rate
- Open source so people might think it's Free
Once again AMD makes gaming history – with the world’s first virtually “zero-cost” technology to enable perfectly smooth gameplay with no costly proprietary hardware, royalties, or licensing costs. That means monitors with Radeon FreeSync™ technology deliver perfectly smooth gaming for up to $300 less than comparable displays with competing technologies. So even on a budget, you have more cash for what really matters: FPS!
G-Sync came first and was noted as an expensive add-on board that was later built into G-Sync displays. I remember some grumbling about its cost when it was first released. So the FreeSync name could be a response to that, indicating it was based on free open source, or a low-cost alternative to G-Sync. In essence, I believe AMD chose the FreeSync name to score marketing points against the expensive G-Sync alternative.
Adaptive Sync is the open standard. FreeSync is AMD's proprietary name for their method of using A-sync with their hardware/software.
So why are FreeSync monitors so much cheaper than G-Sync monitors? I just bought a G-Sync monitor and it was several hundred more than a comparable FreeSync one. How does AMD get away with charging for an open industry standard? Which is what you are saying they are doing. I understand Nvidia charging for G-Sync, because I think that requires a hardware module. I am actually really curious as to how AMD can charge for using an open standard.
I don't know if you have much familiarity with the way industry standards (IETF RFCs, W3C standards, etc.) work, but I cannot start charging people to use DLNA, HTML, or X.509.
Seems a little off to me.
I didn't say that AMD charges you for their proprietary system. A-sync is a free, open standard, which any GPU can use, if the GPU is capable of it. FreeSync is AMD's system for using A-sync on their GPUs with their drivers. I have no idea if they charge for putting their name on a product, but FreeSync is not an open standard. A-sync is.
G-Sync requires them to put more than just a DP connection on the monitor; it also requires a chip, so they charge the manufacturers for those chips, as well as oversee the quality of the monitors to make sure they perform to their standards. They may even charge for that service as well (I do not know).
Well. Whatever, maybe AMD charges a nominal fee to use the FreeSync branding, but I very much doubt they can charge for adaptive sync, whether they re-brand it or not. I am in the process of selling my FreeSync monitor anyway, so it's moot for me.
I already said they don't charge for A-sync. They could charge for FreeSync to be placed on a monitor, but I don't know if they do or not.
So, continuing on: apparently for FreeSync 2, AMD are introducing their own (I assume) proprietary module which will provide Low Framerate Compensation, which isn't part of the Adaptive-Sync standard. It clearly isn't in either AMD's or Nvidia's interest to make it part of the standard, either. So when the FPS goes outside of the FreeSync range, it will do exactly what the G-Sync module does now. I would just like to add that, from my anecdotal experience, neither FreeSync nor G-Sync are miraculous technologies.
I am still seeing stuttering using G-Sync, running two 1080 Tis OCed (@ 2000 MHz) in SLI (Far Cry 3/4). I got stuttering using two Fury Xs in CrossFire in those games as well, although I found I got smoother gameplay using one Fury X, at least in Far Cry 3. I suspect that was because it got much lower but more consistent frame rates, whereas my 1080 Ti(s) go all over the place on frame rate, even on one card. Yes, I have tried a single card, and I still get stuttering.
Additionally, I do have G-Sync enabled and set up correctly.
To add to that, Freesync must do some
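As an aside, the Low Framerate Compensation behaviour mentioned above, where a frame is repeated once the FPS drops below the display's variable-refresh window, can be sketched roughly as follows. This is only an illustrative sketch, not AMD's or Nvidia's actual algorithm, and the 48 Hz floor is an assumed example value:

```python
def lfc_multiplier(frame_rate_hz: float, vrr_min_hz: float = 48.0) -> int:
    """Return how many times each frame should be shown so that the
    effective refresh rate lands back inside the VRR window."""
    if frame_rate_hz >= vrr_min_hz:
        return 1  # already inside the window, no compensation needed
    n = 2
    # repeat the frame just enough times to clear the window's floor
    while frame_rate_hz * n < vrr_min_hz:
        n += 1
    return n

# e.g. at 30 FPS on an assumed 48-144 Hz panel, each frame is shown
# twice, so the panel refreshes at an effective 60 Hz, inside its range
```

The point of the integer multiplier is that the panel never has to hold a frame longer than its hardware allows, while motion cadence still tracks the game's real frame rate.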
If you are not using a 4K monitor, any stuttering you get will mostly be due to a lack of CPU power, or simply because the game is CPU-demanding or poorly coded. On top of that, SLI can add stuttering issues of its own. G-Sync isn't a magical stutter fix, but it does let you avoid tearing without the added latency, or the potential stutter that results from not maintaining a solid 60/120/144 FPS (depending on refresh rate). It doesn't fix standard FPS drops/stutters due to CPU demand or poor coding.
I'd suspect you'd have less stutter in Far Cry 3 with one GTX 1080 Ti. The game is not very demanding for a single GTX 1080 Ti, much less two, and it has always had multi-GPU issues.
I get just as much stuttering with a single card, but G-Sync gets touted as a magic fix, especially by reviewers, and it really isn't. There are a lot of factors. I think people need to be given more realistic expectations of the technology; that goes for FreeSync too. I know it's not the CPU. That's chugging along @ 4.2 GHz (boost/XFR), and my RAM is OCed. That having been said, I posted in another thread that I get buttery smooth performance in Dishonored 2.
That game was getting horrible tearing on my non-G-Sync monitor. Now the tearing is gone completely. I just get sick of all the fanboyism around this shit. Facts are better, anecdotal though they may be. I can tell you there is no difference between the limitations of FreeSync and G-Sync that I can see.
BTW: Far Cry 3 is extremely demanding on Ultra, even with MSAA 2x. Just not in all cases. I assume you actually have the experience of playing with a GTX 1080 Ti @ 3440x1440 on those settings yourself?
I should probably add that I am running software to monitor CPU and card frequency/temps while I game, as well as an FPS counter.
Oh, and the rig I was running the Fury X cards on is no. 2 in my sig. I also find it curious that some people feel the need to defend G-Sync's honour. Who gives a fuck? It's just another technology that's sold as a miracle cure, but it isn't. Personally, I can accept that.
I just spent 1300 bucks on a G-Sync monitor, knowing full well I was still going to have issues with some games.
I haven't played at 3440x1440. I have played Far Cry 3 with SLI, and I know that game has all kinds of issues that get bigger with SLI. You should compare your FPS with a single 1080 Ti versus two; you will likely find one does better most of the time.
Anyway, G-Sync and FreeSync do help. They fix a lot of issues, but they don't fix all issues. If a game has inherent issues, G-Sync doesn't fix them. G-Sync fixes a specific problem: the stutter you get when you use V-sync (to prevent tearing) and fail to maintain an FPS that matches your refresh rate. On top of that, if you keep your FPS from reaching your refresh rate, it also removes the added latency that V-sync can introduce. If a game engine has stuttering issues, they don't fix that.
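That specific problem can be shown with some quick numbers. With plain V-sync on a fixed 60 Hz display, a finished frame has to wait for the next refresh, so on-screen intervals snap up to whole multiples of ~16.7 ms; with adaptive sync, the display waits for the frame instead. A minimal sketch, illustrative only and not any vendor's implementation:

```python
import math

REFRESH_HZ = 60.0
REFRESH_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms between fixed refreshes

def vsync_display_interval(render_ms: float) -> float:
    """With classic V-sync, a frame is held until the next fixed refresh,
    so the on-screen interval rounds up to a whole number of refreshes."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def vrr_display_interval(render_ms: float) -> float:
    """With adaptive sync, the panel refreshes when the frame is ready
    (within its supported range), so the interval tracks render time."""
    return render_ms

# A frame that takes 17 ms (just missing 60 FPS) is held ~33.3 ms under
# V-sync -- the visible stutter -- but only 17 ms with adaptive sync.
```

That jump from one refresh interval to two is exactly the judder people notice when a game hovers just below the refresh rate, and it is the one thing both technologies genuinely remove.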
Most dips in FPS are problems on the CPU side. Don't kid yourself into believing that your CPU is immune to issues. No CPU is immune. You can blame the game engine if you like, but it only takes one core on your CPU to hold things up and cause stuttering. Some stuttering is related to drivers on the GPU side. You can usually spot these by watching GPU usage: when GPU usage drops, that's almost always a CPU/RAM issue, or the loading of a new area.
Yeah. I do know what adaptive sync does, and there are no issues with my CPU performance. You can kid yourself that there are no issues with G-Sync if you like. My point is that it is marketed like it's a miracle (the FreeSync or G-Sync implementation of adaptive sync) when in reality it isn't. I read a recent FreeSync review, for example, where the reviewer claimed that with this screen you will never get stuttering or tearing again, because it has FreeSync.
I have read similar G-Sync reviews. People should be aware that it doesn't work like that. That's all. This thread is full of rampant fanboyism and a total lack of facts. Like I said, I spent 1300 bucks on a G-Sync monitor this week, but I know what G-Sync does; I wasn't expecting perfection, and I didn't get it.
You are going to get a lot of people who spend that sort of money and expect a miracle, but they ain't going to get one.
Did you even read my post, or anyone else's on the topic? I have not seen anyone here say G-Sync or FreeSync fixes all stutter. All it does is fix a particular cause of stutter. If the game itself has no stutter issues with your current CPU, then G-Sync and FreeSync won't cause stutter, and will remove any stuttering you get as a result of turning on V-sync. No one else has claimed any different.
You seem to be the only one confused here.
My personal experience with FreeSync/G-Sync is that it makes ~45 FPS feel good instead of just tolerable, makes 60 FPS somehow smoother still, and starts to fade in usefulness around 90 FPS. All in all, since I would run most games I play at settings that get me 45-90 FPS, it's a great feature. I just wish G-Sync wasn't so expensive now that I'm on an Nvidia card.
You only talked about fanboyism (to me at least) in the more recent posts. You keep changing the topic every time I respond to you.
No. You are still missing my point completely, but that isn't surprising, is it?
It's the fanboyism. There is only so much Nvidia fanboyism I can handle, even though I am rocking a full Nvidia setup now, including G-Sync.
Reading this thread, it becomes clear that most posters have zero experience with either version of adaptive sync and are just here to fanboy Nvidia, and it shits me. I don't know why; it's actually really irrational.
Caveat: I have only read a few pages but I got the gist of it.
I was just trying to point out that 1) FreeSync and G-Sync do work, 2) no one talking about what they do is being a fanboy, and 3) no one else was saying they fix everything. That seemed to be his own misconception.
You two do understand that you've been echoing each other post after post, and that quoting and responding to each other is not necessarily arguing, don't you?