GTX-1070/G-sync v. R9 390/Freesync

Mar 10, 2006
11,715
2,012
126
It could hurt sales. If people start to see that an A-sync monitor is a lot cheaper, and they can buy an AMD GPU to take advantage of it, they may just do that. It would really help if AMD got a competitive GPU at the high end soon to take advantage of this possibility.

Yeah, except back in the real world, GeForce GTX sales continue to surge...
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
Thanks guys.

Here are the cards I'm considering, linked.

MSI 1070
http://tinyurl.com/zsnlw6b

MSI R9 390
http://tinyurl.com/z7qvufc

Just a few days ago the R9 was cheaper than the 1070. No idea why the price jumped so much.

What on earth??? Just get a Fury or 480. That is a huge budget you have and I think you aren't making the most of it, but it's your money. I am sure you could find better prices than that for a 390. No way. You should take note that it's not sold by Newegg but by a third party, and it's not official pricing. It's way higher than it should cost.

Standards battle? There is no battle. Gsync is the more robust solution and the installed base that Gsync targets is much larger than the base that FreeSync targets.

OP, get a nice Gsync monitor and 1070. The monitor will last you for years and NV's execution at the high end of the gaming market should give you confidence that once you're in the NVIDIA/Gsync ecosystem, you will have compelling GPUs to buy.

You are right, there is no battle. Gsync is already dead. The vast number of freesync monitors has already secured it as the standard. Not to mention what he would be giving up with a G-sync monitor. They usually have just one DisplayPort connection and other limitations, thanks to the module. Buying a $700 monitor with a single DisplayPort and some USB ports in 2016 is laughable.

lol at people recommending the fury cards in this thread. I guess fanboyism has no limits. Buying a 4GB card in 2016 with a big budget? LMAO!

Sent from my HTC One M9

It's really not bad. A 1070 is sometimes slower than a 980 Ti, and Fury cards regularly beat the 980 Ti in DX12. Add the ~$100 reduced cost to buy a Fury X, or ~$170 to buy a Fury, and it's kind of a no-brainer, except for the VRAM limit. For him to go 1070 and Gsync would be a few hundred dollars more.

I still get a chuckle from some of the predictions that some people made about Polaris. 100W 980 Ti performance for $200 was my favorite.

actually, with OC and dx12 it should be pretty close. Not 100 W though.
 

nvgpu

Senior member
Sep 12, 2014
629
202
81
Buying any monitor now is a waste of money and locks you out of new monitors with DisplayPort 1.4 & HDR support.

Also, if AMD believed so much in "standards", why not just use the standard's name, Adaptive-Sync, instead of acting as if they invented something with "FreeSync"? FreeSync is not a standard at all.

G-Sync is much more robust & proven tech and Nvidia will never support inferior VESA Adaptive-Sync that has proven to be laughable at best with AMD's poor implementations.
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
actually, with OC and dx12 it should be pretty close. Not 100 W though.
What nonsense. So a couple of AMD evolved games now make the 480 "pretty close" to a 980Ti in DX12? I mean are you freaking kidding me?

What about the other DX12 games where it gets destroyed? What about DX11 games where at times even a 970 destroys a 480?

This assumption that AMD cards are a whole tier above in DX12 is getting absolutely ridiculous and totally unfounded fanboyism.

Oh and btw there is practically no OC on Polaris cards lol.



Sent from my HTC One M9
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
What nonsense. So a couple of AMD evolved games now make the 480 "pretty close" to a 980Ti in DX12? I mean are you freaking kidding me?

What about the other DX12 games where it gets destroyed? What about DX11 games where at times even a 970 destroys a 480?

This assumption that AMD cards are a whole tier above in DX12 is getting absolutely ridiculous and totally unfounded fanboyism.

Oh and btw there is practically no OC on Polaris cards lol.



Sent from my HTC One M9

Calm down. I was simply pointing out that someone could have started that rumor with an overclocked 480 doing well in one game.

Lots of users are running 480s at or over 1400MHz. If that's no OC, w/e.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Yeah, except back in the real world, GeForce GTX sales continue to surge...

If the lack of Adaptive Sync support hurts sales, you can expect them to start supporting it. If AMD had had a high-end GPU a couple of months ago (they still don't), I probably would have gone that route due to Freesync costs. I'm sure there are others who feel the same.

When this might really start to hurt is when a lot of people have upgraded their monitors and find themselves with a Freesync-capable monitor, even if they had not specifically looked for one. When those people go to upgrade their GPU, they are likely to give AMD a higher priority.

We may not be there yet, and maybe it won't happen, but it is a very likely scenario down the road.
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
Calm down. I was simply pointing out that someone could have started that rumor with an overclocked 480 doing well in one game.

Lots of users are running 480s at or over 1400MHz. If that's no OC, w/e.
Those were rumors before the card was released or had any reviews. I thought you were implying that those rumors weren't far off. Sorry for the misunderstanding.

Sent from my HTC One M9
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
If the lack of Adaptive Sync support hurts sales, you can expect them to start supporting it. If AMD had had a high-end GPU a couple of months ago (they still don't), I probably would have gone that route due to Freesync costs. I'm sure there are others who feel the same.

When this might really start to hurt is when a lot of people have upgraded their monitors and find themselves with a Freesync-capable monitor, even if they had not specifically looked for one. When those people go to upgrade their GPU, they are likely to give AMD a higher priority.

We may not be there yet, and maybe it won't happen, but it is a very likely scenario down the road.

Yeah, I think this is what will happen. Freesync is going to be so common that people will end up with it and want to take advantage of it.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Freesync can actually do that now (or something similar to it). It's called LFC (Low Framerate Compensation). It's enabled automatically, but only on Freesync monitors where the max refresh rate is at least 2.5 times the min refresh rate. (PDF source: http://www.amd.com/Documents/freesync-lfc.pdf)

It can do that, and it can also do the other things all gsync monitors do as standard, but the reality is most don't. That's always the counter to the "there are hundreds of freesync monitors so it must be taking over the world" argument. The reality is there are hundreds of monitors with bad freesync implementations and very few that actually meet what you'd expect to find on every gsync monitor (min 30fps, LFC, overdrive that works while variable sync is active, etc.).

If we exclude the "any freesync is better than nothing" argument and concentrate only on people who want it to work well, then there are almost the same number of freesync and gsync monitors for sale today, as most companies tend to make one monitor and offer both variants. They are all still fairly pricey. The gsync variant will cost more but tends to include things like a better dead-pixel policy, so you often get more than just gsync.

Anyway (irrespective of it being Nvidia's fault), you do have to pick a side when it comes to monitors: AMD will never support gsync, and realistically it's highly unlikely Nvidia will ever support freesync. Hence you buy the monitor and it will dictate your next GPU purchase or two. If you really like AMD as a company, then support them! Personally I have my doubts AMD will be able to stay competitive with Nvidia (they can't compete in the OP's price range at all), so pragmatically you are probably safer picking Nvidia. But really, what chance has AMD got of recovering if even those who really want AMD to succeed don't buy their stuff?
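
To make the quoted 2.5x LFC rule concrete, here's a minimal sketch of the eligibility check; the monitor ranges used below are hypothetical examples, not figures from this thread:

```python
# Minimal sketch of AMD's LFC (Low Framerate Compensation) eligibility rule
# quoted above: LFC kicks in automatically only when a FreeSync monitor's
# maximum refresh rate is at least 2.5x its minimum refresh rate.
# The monitor ranges below are hypothetical examples.

def supports_lfc(min_hz: float, max_hz: float) -> bool:
    """True if the FreeSync range is wide enough for LFC (max >= 2.5 * min)."""
    return max_hz >= 2.5 * min_hz

examples = {
    "48-144 Hz panel": (48, 144),  # ratio 3.0 -> LFC available
    "40-60 Hz panel": (40, 60),    # ratio 1.5 -> no LFC
    "30-75 Hz panel": (30, 75),    # ratio 2.5 -> just meets the rule
}

for name, (lo_hz, hi_hz) in examples.items():
    print(f"{name}: LFC {'yes' if supports_lfc(lo_hz, hi_hz) else 'no'}")
```

By that check, a 40-60Hz range like the one mentioned further down the thread still gives you variable refresh, but falls short of the ratio needed for LFC.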
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
Yeah, except back in the real world, GeForce GTX sales continue to surge...

Back in the real world Nvidia's sales of video cards dropped by almost 30% from 9.2 million in Q1 to 6.6 million in Q2. The only thing that's surging is their ASP.
 
Feb 4, 2006
110
7
81
I think one thing this debate has proved about monitors is that this tech is way too new to try to guess which one will win out in the end, if either. I'm not confident enough right now in either Freesync or G-sync, future-proofing-wise, to spend a premium on either.

Probably going to go with a refurb Asus VG248QE for 200 bucks.

Here's my build so far.
http://pcpartpicker.com/list/F6pdHN
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
There is nothing wrong with buying into either, just realize it will probably dictate your next couple GPU purchases beyond this one. There is always new tech, but you can't let that prevent you from making any purchase. There are great monitors out now, even if new tech may be coming in the future.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
NVIDIA won't support Adaptive Sync in the future; their execs have been very clear that they are all in on G-Sync. After all, they've put a lot of work and effort into building it, so why should they give it all up?

Situations change.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Arachnotronic said:
NVIDIA won't support Adaptive Sync in the future; their execs have been very clear that they are all in on G-Sync. After all, they've put a lot of work and effort into building it, so why should they give it all up?

They'd give it up because putting FPGAs in a monitor is not at all sustainable. Either they need to design an ASIC to do Gsync or use the ASIC-less Adaptive Sync standard. Do you really think an FPGA-based version is going to be competitive in a world with sub-$200 Adaptive Sync monitors that do substantially the same thing?
 
Feb 4, 2006
110
7
81
There is nothing wrong with buying into either, just realize it will probably dictate your next couple GPU purchases beyond this one. There is always new tech, but you can't let that prevent you from making any purchase. There are great monitors out now, even if new tech may be coming in the future.

I just don't want to get caught in a situation where I'm holding a Circuit City DIVX player ya know. Especially with the premiums on these sync monitors :)
 
Mar 10, 2006
11,715
2,012
126
Situations change.

True :) If NV were seeing its business negatively impacted by not adopting Adaptive Sync, then I could see them using it. But with NV's gaming GPU revenues continuing to grow quickly and with its share/installed base of the gaming GPU market where it is, there's little reason to throw in the towel on GSync.
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,400
2,437
146
They should support both Adaptive Sync and Gsync. There's no reason to give up one or the other; rather, support both.
 

SolMiester

Diamond Member
Dec 19, 2004
5,331
17
76
You would get a better experience with a 1070 and a plain monitor than a 480 with free-sync anyway, so it's a bit of a moot point.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
The OP's budget allows for a G-sync monitor; interesting how you didn't recommend this. Seems almost like you want to force the OP into the AMD ecosystem by having him get a placeholder RX 470 and a FreeSync monitor so that, whether Vega is competitive or not, the OP is stuck buying an AMD GPU.

Ironic that you will recommend him buying a GSync monitor which forces him to only buy Nvidia GPUs, but don't recommend him buying a VESA Standard Adaptive Sync monitor which both AMD and Nvidia can support.

The only thing preventing Nvidia from supporting Adaptive Sync is their greed. They could easily support both adaptive sync and gsync.
 

showb1z

Senior member
Dec 30, 2010
462
53
91
It can do that, and it can also do the other things all gsync monitors do as standard, but the reality is most don't. That's always the counter to the "there are hundreds of freesync monitors so it must be taking over the world" argument. The reality is there are hundreds of monitors with bad freesync implementations and very few that actually meet what you'd expect to find on every gsync monitor (min 30fps, LFC, overdrive that works while variable sync is active, etc.).

If we exclude the "any freesync is better than nothing" argument and concentrate only on people who want it to work well, then there are almost the same number of freesync and gsync monitors for sale today, as most companies tend to make one monitor and offer both variants. They are all still fairly pricey. The gsync variant will cost more but tends to include things like a better dead-pixel policy, so you often get more than just gsync.

This again. Like you said, there are just as many high-end Freesync as G-sync options. Obviously, you get what you pay for. A $200 24" won't have the same features as a $1000 34" curved ultrawide, but that doesn't make them "bad freesync implementations".
How it could ever be considered a negative that Samsung, LG, Acer, etc. use DP1.2a in all their monitors is beyond me. Even a 40-60Hz range is a nice bonus; it's a win for everyone.

But with NV's gaming GPU revenues continuing to grow quickly and with its share/installed base of the gaming GPU market where it is, there's little reason to throw in the towel on GSync.

Once again, they don't have to ditch G-sync; they can easily support both. They already seem to have people convinced that G-sync is some kind of premium implementation, and hell, it actually has a couple of advantages like ULMB. They can continue to market it like that. It's really not that far-fetched that they will support A-sync in the future.
 