[Tom's Hardware] AMD FreeSync Versus Nvidia G-Sync


bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
http://www.tomshardware.com/reviews/amd-freesync-versus-nvidia-g-sync-reader-event,4246.html

Tom's Hardware did a blind test to see which the masses prefer between FreeSync and G-Sync. While not perfect, and with a small sample size, it is an interesting read.

Summing The Analysis
We started down the path of comparing AMD’s FreeSync technology to Nvidia’s G-Sync in the hopes that a community-based, hands-on approach would provide additional insight or perhaps lead us to some unexpected conclusion. In some ways it has. But we naturally had some hypotheses going into our event, and those largely proved true, too.

Let’s get the big question out of the way: what’s better, G-Sync or FreeSync? As both technologies stand right now, the G-Sync ecosystem is more mature. From compatible graphics cards to G-Sync-capable displays and variable refresh ranges, Nvidia has the leg up. It also has an advantage when you drop below the variable refresh range. Our experiment never took us to that point, fortunately, so it didn't become an issue we needed to address in the analysis.

Then again, you’ll also pay $150 more for the G-Sync monitor we tested today—a premium that exceeds what most of our respondents claimed they’d be willing to spend for the experience they ultimately preferred, and some of those folks even picked AMD’s less expensive hardware combination as their favorite.

...
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Not a surprise.

[Image: freesync-or-gsync.png]
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
So they test with different monitors meaning this becomes a monitor test. Yet they avoid testing bouncing off min variable refresh, which is an important factor in which variable refresh works better.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
So they test with different monitors meaning this becomes a monitor test. Yet they avoid testing bouncing off min variable refresh, which is an important factor in which variable refresh works better.

I think they were trying to compare them at their best to see if people noticed a difference. People still chose Nvidia, even without its biggest advantage in play. Though to be honest, I'd never play a game at sub-40 FPS, so I wouldn't really care much about seeing that advantage.

The other part, which I didn't think about at first, was that they used different hardware: a 970 for G-Sync and a 390X for FreeSync. No matter what they did, there would be some variables that aren't perfect.

I'd like to see a larger sample size that includes all the things the Nvidia rep suggested, but it is a start and it was worthwhile nonetheless.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
It's not a fair test unless you build out the systems so the costs are exactly equal. That means if a G Sync monitor costs $100 more, then you must subtract $100 from the GPU budget. If you did this, I seriously doubt the Nvidia system would come out looking as good.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
It's not a fair test unless you build out the systems so the costs are exactly equal. That means if a G Sync monitor costs $100 more, then you must subtract $100 from the GPU budget. If you did this, I seriously doubt the Nvidia system would come out looking as good.

I completely disagree here. It's not about costs, just simply what tech is preferred, but if it makes you feel better, they did ask what people were willing to pay for the advantage. They also spent an extra $80 on the AMD GPU.
 

kawi6rr

Senior member
Oct 17, 2013
567
156
116
It's not a fair test unless you build out the systems so the costs are exactly equal. That means if a G Sync monitor costs $100 more, then you must subtract $100 from the GPU budget. If you did this, I seriously doubt the Nvidia system would come out looking as good.

I totally agree, because when it comes down to it, people in general will look at the extra cost of the G-Sync monitor, and $150 is a lot of money to most people. I would rather spend that extra $150 on a stronger chip or graphics card.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,597
6,075
136
It's not a fair test unless you build out the systems so the costs are exactly equal. That means if a G Sync monitor costs $100 more, then you must subtract $100 from the GPU budget. If you did this, I seriously doubt the Nvidia system would come out looking as good.

To play devil's advocate, people looking into 1440p high/variable refresh rate solutions and beyond are already <1% of gamers. It is doubtful that they are on strict budgets, and spending $100-200 extra isn't going to make or break a build.

On the other hand, bang for the buck still applies, so it is a very fair point that G Sync costs significantly more. And judging by how many of the 1440p high refresh rate crowd uses "Korean" monitors, I foresee Freesync-capable Korean monitors becoming common rather quickly.

Edit: Apparently, at least one 4K panel with Freesync support already exists:
http://www.overclock.net/t/1554580/got-a-wasabi-mango-4k-42-korean-ips-monitor-what-tests-to-run/550
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Don't forget that this is a "Tech" demo, not a build demo. Build demos need to have cost taken into account. "Tech" demos don't need costs taken into account. They serve very different purposes.

But they were aware of the cost differences, which is why they asked what people would be willing to pay for that advantage. Going off the limited test, roughly half were willing to (slightly less than half) and roughly half were not (slightly more than half). The cost difference between the setups was $120, just so you are aware.
 
Last edited:

ImpulsE69

Lifer
Jan 8, 2010
14,946
1,077
126
Yea, you cannot exclude cost at all. I went with FreeSync simply because of cost; between card and monitor it adds up to a considerable amount. YOU (AT) are the 1% saying cost doesn't matter. I don't feel I need to pay a premium for something that nothing but SLI/CrossFire/tri-GPU setups can even touch at this point. You can't compare tech at completely different price levels and call it purely a tech demo. Cost plays a HUGE role in satisfaction. Quality control alone seems to be an issue with all this 'top end' gear lately (though I can't say I have experienced any issues myself).

Having said that, if FreeSync doesn't mature in the future, I may flip to G-Sync. That remains to be seen. "You get what you pay for" can still apply here.
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
You compare costs if you want to see what you can build with X money, and compare the results.

You do NOT compare costs if you want to compare performance for two products.

Why is it so wrong to find out what tech is preferred on a tech level only?
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
Hard to test conclusively. G-Sync screens all behave more or less the same (when it comes to response times and VRR range) because they all use the module; with FreeSync, that's all up to the chosen hardware.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
Owning both (ROG Swift and BenQ XL2730Z), it is really hard to tell the difference unless the game goes out of the VRR range, which is rare either way (below 40 Hz or above 144 Hz). Outside the VRR range, tearing is the main issue for FS vs. G-Sync. For above 144 Hz on FS I just set a frame limit of 143. Maybe if the monitors were in the same room I would have a preference. I probably would not have bought the ROG Swifts if I had known AMD would actually deliver, as the lack of inputs is a hindrance (though newer G-Sync modules now support other inputs, starting with the Asus PG279Q) and of course $100+ premium for branding is stupid.
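To make the "frame limit of 143" trick concrete: capping the frame rate just below the panel's 144 Hz ceiling keeps the game inside the variable refresh window, so the top-of-range tearing/V-Sync issue never comes up. Below is a minimal sketch of the idea, purely illustrative; the update()/render() names are placeholders, and in practice a tool like RTSS or an in-game limiter does this for you.

```cpp
#include <chrono>
#include <thread>

// Minimal sketch of a 143 fps frame limiter, assuming a 40-144 Hz FreeSync
// panel: staying just under the panel's maximum refresh means the monitor
// never leaves its variable refresh window at the top end.
int main() {
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / 143.0));   // ~6.99 ms per frame

    auto next_frame = clock::now();
    for (;;) {
        next_frame += frame_budget;

        // update();  // game logic would run here
        // render();  // the frame would be submitted to the GPU here

        // Sleep away whatever is left of the frame budget.
        std::this_thread::sleep_until(next_frame);
    }
}
```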
 
Last edited:

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
and of course $100+ premium for branding is stupid.
Probably it's mostly the FPGA on the G-Sync module; those don't come cheap AFAIK.

Not sure why there are no cheap FreeSync screens yet; a $250 screen could be a game-changer.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
Nvidia should have an ASIC by now to replace the FPGA. Maybe that means they're waiting to see how the market for Adaptive-Sync monitors develops, and hopefully they will adopt A-Sync in the near future.
 

Hitman928

Diamond Member
Apr 15, 2012
6,642
12,245
136
IMO, if it was supposed to be a FreeSync vs. G-Sync battle, they shouldn't have used a FreeSync monitor that maxed out at 90 Hz variable refresh against a 144 Hz G-Sync monitor and then left V-Sync off. In this test the AMD system would have an obvious disadvantage and show much more tearing due to the choice of monitors.

They made it a quasi monitor/VRR test rather than a comparison between the technologies. I would rather they had used the TN panels they had available, with the BenQ FreeSync monitor that goes up to 144 Hz, and then used The Witcher 3 for a sub-40 Hz setup, which is where the G-Sync technology has the advantage, rather than a monitor comparison. That would have been a much more valid and interesting test, IMO. To me, the only really good data that came out of the test is how much more consumers are willing to spend to get a better VRR experience. I give Tom's props for the effort, though; it was an enormous undertaking.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
IMO, if it was supposed to be a FreeSync vs. G-Sync battle, they shouldn't have used a FreeSync monitor that maxed out at 90 Hz variable refresh against a 144 Hz G-Sync monitor and then left V-Sync off. In this test the AMD system would have an obvious disadvantage and show much more tearing due to the choice of monitors.

They made it a quasi monitor/VRR test rather than a comparison between the technologies. I would rather they had used the TN panels they had available, with the BenQ FreeSync monitor that goes up to 144 Hz, and then used The Witcher 3 for a sub-40 Hz setup, which is where the G-Sync technology has the advantage, rather than a monitor comparison. That would have been a much more valid and interesting test, IMO. To me, the only really good data that came out of the test is how much more consumers are willing to spend to get a better VRR experience. I give Tom's props for the effort, though; it was an enormous undertaking.

They mentioned that, though only one game went outside the range, and when all tests were within the range, the results were still pretty much the same. That said, it would have been better to compare monitors with the same range and the same display panel.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
One thing I have to wonder is whether the differences were really about FreeSync vs. G-Sync, or whether the GPUs themselves made a noticeable difference. Could people have preferred the 970 over the 390X?
 
Feb 19, 2009
10,457
10
76
So they test with different monitors meaning this becomes a monitor test. Yet they avoid testing bouncing off min variable refresh, which is an important factor in which variable refresh works better.

The only negative with FS is the higher minimum Hz at which it's functional, typically 40 to 45, whereas G-Sync goes down to 30. It is a big deal and a game-breaker for me, because I know games can and will drop to the low 30s, and that's the entire point of getting a sync monitor.

But yeah, that test is invalid if they don't showcase the full range of the sync technology; as it stands, it's just a panel test.
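For readers unfamiliar with why the bottom of the range matters so much: below the panel's minimum refresh rate a variable-refresh system has to fall back to something, and the usual trick is to show each frame more than once so the effective refresh rate lands back inside the window (the G-Sync module does this in hardware, and AMD later shipped a driver-side version as LFC). Here is a rough, hypothetical sketch of that idea, not either vendor's actual algorithm:

```cpp
#include <cstdio>

// Hypothetical illustration of frame multiplication below the VRR window:
// if the frame rate falls under the panel's minimum refresh, repeat each
// frame enough times that (repeats * fps) lands back inside the window.
// Not AMD's or Nvidia's real implementation, just the general idea.
double effectiveRefreshHz(double fps, double minHz, double maxHz) {
    if (fps > maxHz) return maxHz;          // above range: capped at max refresh
    if (fps >= minHz) return fps;           // in range: refresh tracks frame rate
    int repeats = 2;                        // below range: multiply frames
    while (fps * repeats < minHz) ++repeats;
    return fps * repeats;
}

int main() {
    // 32 fps on a 40-144 Hz window (a typical FreeSync panel of the time)
    // versus a 30-144 Hz window (G-Sync): the latter is still in range.
    std::printf("40-144 Hz panel at 32 fps -> %.0f Hz (frames doubled)\n",
                effectiveRefreshHz(32.0, 40.0, 144.0));
    std::printf("30-144 Hz panel at 32 fps -> %.0f Hz (native VRR)\n",
                effectiveRefreshHz(32.0, 30.0, 144.0));
    return 0;
}
```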
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The only negative with FS is the higher minimum Hz at which it's functional, typically 40 to 45, whereas G-Sync goes down to 30. It is a big deal and a game-breaker for me, because I know games can and will drop to the low 30s, and that's the entire point of getting a sync monitor.

But yeah, that test is invalid if they don't showcase the full range of the sync technology; as it stands, it's just a panel test.

Both solutions have issues in different spots. Linus actually did scientific testing of input lag.

There is a big difference in input lag between GSync/FreeSync with VSync On/Off and it also depends on the FPS.

45 FPS
[Image: Crysis_3_1440_P_45_fps_No_VSync.jpg]
[Image: Crysis_3_1440_P_45_fps_VSync.jpg]

144 FPS
[Image: Crysis_3_1440_P_No_VSync.jpg]
[Image: Crysis_3_1440_P_VSync.jpg]

200 FPS
[Image: Crysis_3_1440_P_200_fps_No_VSync.jpg]


For high speed gaming, FreeSync has a massive advantage in input lag. If playing at ~ 45 fps with VSync on, FreeSync wins again.

The question is: are you the type of gamer who would leave V-Sync on or off when playing a game at 35-75 fps? If you leave it on, FreeSync is actually better, but that's not how the test systems were configured in this study.

The testing methodology used at Tom's Hardware:

"I counted more than 180 responses on Facebook before siding with the majority and making the call that we&#8217;d leave v-sync off"

Based on Linus' testing, leaving VSync On or Off at different FPS has a dramatic impact on input lag between FreeSync and GSync (see graphs above):

https://www.youtube.com/watch?v=MzHxhjcE0eQ

There is also the case that in the real world GSync costs significantly more which makes the test a bit trivial. Sometimes the price premium for a GSync monitor is high enough to jump 1-2 tiers of GPU horsepower which means someone building a new system could move from a GTX970 to a Fury.

Also, there is a way to get a sub-$1000 4K monitor with FreeSync in larger sizes, something I haven't seen as even a possibility right now for GSync.

For example, someone who wants the industry to adopt open standards and a unified adaptive-sync standard is more likely to support FreeSync on principle. Similarly, someone with a limited budget for their rig might choose FreeSync and either put the saved money toward other things or step up to a Core i7 or a faster GPU. Someone who only buys NV will only consider G-Sync in the first place, even if G-Sync were inferior. Therefore, in the real world all of these factors make choosing between FreeSync and G-Sync different from what a random blind study suggests.

I look forward to more professional/scientific testing of various G-Sync/FreeSync monitors to help us make a better-informed buying decision.

-------------

Did some of you guys read the article though?

"Let's start with Nvidia's commentary, provided by Tom Petersen, director of technical marketing and one of our attendees.

The side by side blind testing technique is a great way to get some direct feedback from gamers about new technologies. Unfortunately, it is also the case that whenever someone knows what we are testing for they are biased to some extent which could inadvertently impact their testing and feedback.

There are a few techniques that can help mitigate this inherent expectation bias:

1. Double blind studies – the test administrator and the testers should not know what is being tested.

a. Don't tell the gamers the purpose of the evaluation – knowledge of this being a G-Sync vs. FreeSync could impact results.

b. Use volunteers to run the test flow to eliminate the risk of administrators passing along test information

2. Include a control group with nothing new. In this case I would have used one of the monitors in "fixed refresh rate mode."

3. Increase the sample size. This may be very difficult in practice, but more data is definitely better when science is involved.

Overall I enjoyed the opportunity to engage with THG&#8217;s community. I look forward to seeing the results."

Also, the testing participants were clearly NV-biased.

[Image: are-you-a-fan.png]
 
Last edited:

Azix

Golden Member
Apr 18, 2014
1,438
67
91
The only issue is the lack of support for adaptive sync from Nvidia. Personally, G-Sync is a complete no-way-in-hell for me. That is a hard vendor lock-in. Never.

The thing about FreeSync is that it's open and a standard for manufacturers to implement. It seems AMD's driver support on their end is good, and the final experience comes down to the monitor. Evaluations of one monitor won't be representative of the entire range, and definitely not of the future. Months ago FreeSync might have looked worse than it does now because of the available implementations. Later it might look better.

To me it's the only option, and comparisons among monitors within that ecosystem have more value.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
Both solutions have issues in different spots. Linus actually did scientific testing of input lag.

There is a big difference in input lag between GSync/FreeSync with VSync On/Off and it also depends on the FPS.

45 FPS
[Image: Crysis_3_1440_P_45_fps_No_VSync.jpg]
[Image: Crysis_3_1440_P_45_fps_VSync.jpg]

144 FPS
[Image: Crysis_3_1440_P_No_VSync.jpg]
[Image: Crysis_3_1440_P_VSync.jpg]

200 FPS
[Image: Crysis_3_1440_P_200_fps_No_VSync.jpg]


For high speed gaming, FreeSync has a massive advantage in input lag. If playing at ~ 45 fps with VSync on, FreeSync wins again.

Still, the tests are done on different monitors, but it's clear that G-Sync vs. FreeSync isn't as simple as just running a couple of games randomly, since such a test wouldn't account for all the scenarios a gamer might experience - i.e., 45 fps, 60 fps, 90 fps, 120-140 fps and 200 fps (competitive gamer), etc.

https://www.youtube.com/watch?v=MzHxhjcE0eQ

There is also the case that in the real world GSync costs significantly more which makes the test a bit trivial. Sometimes the price premium for a GSync monitor is high enough to jump 1-2 tiers of GPU horsepower which means someone building a new system could move from a GTX970 to a Fury.

Also, there is a way to get a sub-$1000 4K monitor with FreeSync in larger sizes, something I haven't seen as even a possibility right now for GSync.

For example, someone who wants the industry to adopt open standards and a unified adaptive-sync standard is more likely to support FreeSync on principle. Similarly, someone with a limited budget for their rig might choose FreeSync and either put the saved money toward other things or step up to a Core i7 or a faster GPU. Someone who only buys NV will only consider G-Sync in the first place, even if G-Sync were inferior. Therefore, in the real world all of these factors make choosing between FreeSync and G-Sync different from what a random blind study suggests.

I look forward to more professional/scientific testing of various G-Sync/FreeSync monitors to help us make a better-informed buying decision.

I think the only case of a "massive advantage" is your first chart: at 45 fps with V-Sync off, Nvidia has a major advantage.
Perhaps the bar graph scale confused you: the bottom graph is in increments of 5 ms and the top is in increments of 20 ms, so a 10 ms difference in the bottom graph looks larger than a 31 ms difference in the top graph.

Also, Nvidia doesn't recommend G-Sync with V-Sync, so I don't think the 45 fps with V-Sync on case makes any sense. Why buy a G-Sync monitor if you are going to turn V-Sync on? That defeats the purpose.

At 144+ fps on a 144 Hz G-Sync monitor, G-Sync starts having the same latency as V-Sync on.
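A quick back-of-envelope illustration of why that happens (my own arithmetic, not from the video): at 144 Hz the panel can only accept a new frame roughly every 6.9 ms, so once the GPU finishes frames faster than that, completed frames have to wait for the next scan-out, which is the same wait V-Sync imposes.

```cpp
#include <cstdio>

// Back-of-envelope comparison of GPU frame time vs. a 144 Hz panel's
// refresh interval. Once frames are produced faster than the panel can
// scan them out, they queue for the next refresh, just like V-Sync.
int main() {
    const double refresh_ms = 1000.0 / 144.0;          // ~6.94 ms per refresh
    const double fps_cases[] = {45.0, 144.0, 200.0};

    for (double fps : fps_cases) {
        const double frame_ms = 1000.0 / fps;
        std::printf("%5.0f fps -> %5.2f ms/frame: %s\n", fps, frame_ms,
                    frame_ms < refresh_ms
                        ? "GPU outruns the panel, frames wait (V-Sync-like)"
                        : "panel refreshes when the frame arrives (VRR)");
    }
    return 0;
}
```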

I think these tests aren't showing us anything unexpected; the results aren't all that interesting.

The test in the OP is much more interesting because it tries to keep things out of the ranges where we already know one side has an advantage. It compares them at their best, not in a setup built around their worst.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Reading the article, I am amazed the OP didn't mention some key details:

"Four of those who mentioned Crysis preferred their experience on AMD&#8217;s hardware, eight chose Nvidia&#8217;s and one said the two technologies were of equal quality, though his Nvidia-based platform did stutter during an intense sequence."

> This calls into question the validity of the testing: some participants considered the experience similar even though the NV-based platform stuttered?!

"Borderlands turned out to be a gimmie for Nvidia since the AMD setups were destined to either tear (if we left v-sync off) or stutter/lag (if we switched v-sync on)."

> OK, so this isn't really a test of G-Sync vs. FreeSync but of how well optimized a GameWorks title is to run on AMD vs. NV. Figures! And yet it's a data point that contributes to showing that G-Sync is better than FreeSync? That makes no sense.

This next one is downright shocking:

"Right next to them, we had another AMD machine at Ultra settings and an Nvidia box dialed down to the High preset. Again, three respondents picked AMD&#8217;s hardware. Seven went with Nvidia, while two said they were of equal quality"

> This right away calls into question the experience/quality of the gamers they chose: 7 people picked NV and 2 said the setups were equal, despite BF4 running at Ultra on AMD and only High on NV.

[Image: bf4-amdultra-nvidia-high.png]


Not only are the frame rates not comparable in this test, but the IQ isn't either.

Also, how is it that 10 of 48 participants admitted to knowing which test rig had which hardware when this is supposed to be a blind study?

[Image: which-machine-is-which.png]


I bet that if we had been present, we could probably have told which participants were AMD or NV fans just from which gaming rigs they paid more attention to!

I don't know, but considering this chart, the study should have been done ONLY with brand-agnostic PC gamers to make it fair.

[Image: are-you-a-fan.png]
 