Long-time nVidia users considering a switch

synergy.inc

Junior Member
Nov 17, 2015
7
1
0
Hey folks,

I used to be up on all the newest tech, but alas, life has dished me some different priorities. :\ So I've come to you experts to get some advice.

I recently purchased a BenQ XL2730Z 2K monitor. This was a bit overkill for my current rig (i7 2600K, GTX 560 Ti), so I'm looking for something a little more powerful to take advantage of higher resolutions. The 560 Ti has been an excellent card for the last 4 years, and I've owned almost exclusively EVGA cards for more than 10 years. The monitor, however, has FreeSync. Should I even consider moving to AMD for this feature, or is it pretty much a gimmick? I'm deciding between a GTX 970 and an R9 390, if they'll provide good performance at 2K.

Couple other questions I have as well:

  1. My motherboard (Sabertooth P67) only runs PCI-Express 2.0. Is it pointless to upgrade to a PCIe 3.0 card without first upgrading the motherboard (something I don't really want to do right now)?
  2. I want to get the highest available refresh rate of 144Hz, but I've heard that you basically need to run at that many FPS to get that advantage. Is this true, and if so, would a 970 or 390 provide that (seems unlikely)?
  3. Overall, which is the better card between these two? The specs are quite different and I'm not up on the latest terminology.
 

96Firebird

Diamond Member
Nov 8, 2010
5,736
329
126
I see no reason not to go AMD since you already have a FreeSync monitor. I've never experienced FreeSync or G-Sync, but I'd say it is probably more than a gimmick. The 970 and 390 are pretty much even, so the 390 will be a good match for your monitor. No need to worry about PCIe 2.0; you'll only lose low single-digit percent performance, if anything.
 

4K_shmoorK

Senior member
Jul 1, 2015
464
43
91
Well, FreeSync will allow you to experience tear-, artifact-, and stutter-free gameplay when maintaining between 40 and 144 FPS. The difference isn't astronomical when you are producing more than 60 FPS in a title. FreeSync really shines when you dip into the 40-59 FPS range. You should notice no difference between 60 and 45 FPS, so frame dips will not break your immersion in demanding titles.
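To make that range concrete, here's a tiny sketch of how a variable-refresh window works. The 40-144 Hz numbers match the XL2730Z's advertised FreeSync range; the function and labels are made up for illustration, not any real driver API:

```python
# Hypothetical helper: classify a frame rate against a FreeSync window.
# Inside the window, the monitor's refresh tracks the GPU's output exactly.

def vrr_behavior(fps, window=(40, 144)):
    """Return how a variable-refresh monitor handles a given frame rate."""
    low, high = window
    if fps < low:
        return "below window"   # falls back to fixed refresh: tearing or stutter
    if fps > high:
        return "above window"   # capped (with V-Sync) or tearing (without)
    return "in window"          # refresh matches the GPU: no tearing, no judder

print(vrr_behavior(45))    # in window  -> this is where FreeSync shines
print(vrr_behavior(35))    # below window
```

So a dip from 60 to 45 FPS stays inside the window and just slows the refresh along with it, which is why it goes unnoticed.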

If you have a FreeSync monitor already, then yes, get a 390 or a custom 290X. No reason to buy Nvidia, really, if it means you are unable to use FreeSync.

EDIT:
Although, if you are thinking of dropping ~$300 on an upgrade, you might as well wait until the Pascal/Arctic Islands cards come out. 28nm cards are nearing their end, and if you've waited this long, you may as well tough it out for a few more months.
 
Last edited:

synergy.inc

Junior Member
Nov 17, 2015
7
1
0
Thanks guys! Definitely helpful. I went with the Sapphire Nitro R9 390. It seems to have good reviews and good specs. I do worry about the 2-year warranty, but Sapphire has a good record.

Thanks also for the heads-up, 4K. If I were to wait for the next new thing when upgrading, it would be an endless cycle! :p If the increase in performance with the new cards looks incredible, then I'll probably dump this new one on eBay and pick up an Arctic Islands card. I also have other components that are aging, so I might be rebuilding everything within the next year or so anyway.
 

kaesden

Member
Nov 10, 2015
61
2
11
I have a G-Sync monitor, and I can say from experience that variable refresh rate monitors are awesome. FreeSync is definitely something you should consider. Since you've bought into the AMD ecosystem with your monitor, which likely won't be replaced anytime soon, you really should consider an AMD video card to take advantage of that technology.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
I'm considering a switch to AMD for my next GPU, because FreeSync screens seem to be cheaper.

Waiting for the next die shrink though, like many others.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
synergy.inc said:
Hey folks,

I used to be up on all the newest tech, but alas, life has dished me some different priorities. :\ So I've come to you experts to get some advice.

I recently purchased a BenQ XL2730Z 2K monitor. This was a bit overkill for my current rig (i7 2600K, GTX 560 Ti) so I'm looking for something a little more powerful to take advantage of higher resolutions. The 560 Ti has been an excellent card for the last 4 years and I've owned almost exclusively EVGA cards for more than 10 years. The monitor, however, has FreeSync. Should I even consider moving to AMD for this feature, or is it pretty much a gimmick? I'm considering between a GTX 970 or R9 390, if they'll provide good performance at 2K.

Couple other questions I have as well:

  1. My motherboard (Sabertooth P67) only runs PCI-Express 2.0. Is it pointless to upgrade to a PCIe 3.0 card without first upgrading the motherboard (something I don't really want to do right now)?


  1. Yes, it's pointless to upgrade for PCIe 3.0 alone. There is about a 1% difference at 1440P between PCIe 2.0 x16 and PCIe 3.0 x16:

    (chart: relative performance at 2560x1440)

    https://www.techpowerup.com/reviews/AMD/R9_Fury_X_PCI-Express_Scaling/18.html
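The raw numbers back this up: even halving the link bandwidth leaves far more than a single GPU typically streams per frame. A quick back-of-envelope sketch (the helper name is mine, not from any library; figures are the spec's per-direction theoretical rates):

```python
# Approximate usable PCIe bandwidth, one direction.
# Gen 2 signals at 5 GT/s with 8b/10b encoding; Gen 3 at 8 GT/s with 128b/130b.

def pcie_bandwidth_gbps(gen, lanes):
    """Theoretical usable bandwidth in GB/s after encoding overhead."""
    rates = {2: 5e9 * 8 / 10, 3: 8e9 * 128 / 130}  # bits/s per lane
    return rates[gen] * lanes / 8 / 1e9            # bits -> gigabytes

print(round(pcie_bandwidth_gbps(2, 16), 1))   # 8.0  GB/s
print(round(pcie_bandwidth_gbps(3, 16), 2))   # 15.75 GB/s
```

Gen 3 x16 is roughly double Gen 2 x16, yet the benchmark gap is ~1%, because the card works mostly out of its own VRAM and only occasionally pulls assets over the bus.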

    2. I want to get the highest available refresh rate of 144Hz, but I've heard that you basically need to run at that many FPS to get that advantage. Is this true, and if so, would a 970 or 390 provide that (seems unlikely)?

    In most modern games at 1440P @ 144Hz, you are going to need to either turn settings down or have 980 SLI, 980 Ti SLI, or similar. A single 390/970 will not hit 120-140 FPS in modern titles with maxed-out settings and MSAA. In older games, sure. Don't worry though, because the biggest benefit for most gamers comes from moving from 60Hz to 90-100Hz. You don't need 144Hz to feel the benefit of the faster motion at 90-100Hz.
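One way to see why the 60-to-100Hz jump matters more than the 100-to-144Hz jump: each target refresh rate is a per-frame time budget, and the gains shrink as the rate climbs. A quick sketch (helper name is mine, for illustration):

```python
# Frame-time budgets: milliseconds the GPU has to render each frame.

def frame_budget_ms(hz):
    """Time per frame at a given refresh/frame rate."""
    return 1000.0 / hz

for hz in (60, 100, 144):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.2f} ms per frame")
```

Going 60 to 100 Hz cuts about 6.7 ms off each frame's duration; going 100 to 144 Hz cuts only about 3.1 ms more, which is why the perceived smoothness gain tapers off.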

    3. Overall, which is the better card between these two? The specs are quite different and I'm not up on the latest terminology.

For 1440P, and since you have a FreeSync monitor, the 390 is better. However, if you play specific games a lot (Anno 2205, Project CARS, WoW), then NV will have an advantage. It makes sense to grab a 390 (the PowerColor PCS+ for $275 or the Sapphire Nitro 390 are the best 390s) and just upgrade again in 2 years to a faster 16nm HBM2 GPU.


If you have an overclocking motherboard, grab a good CPU cooler and overclock that 2600K to 4.4-4.8GHz.
http://www.newegg.com/Product/Produ...5&cm_re=cryorig_h7-_-9SIA4UF2DZ6565-_-Product
or the best in the $40 class, the Thermalright True Spirit 140:
http://www.nansgaminggear.net/ProductDetails.asp?ProductCode=TS-140-BW-RevA-TY147A
 

Mopetar

Diamond Member
Jan 31, 2011
8,382
7,481
136
There really isn't much of a difference even when using PCIe 2.0 x8 either. I knew it wasn't a big deal unless you were running multiple high-end cards, but I didn't think the gap was that small.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Mopetar said:
There really isn't much of a difference even when using PCIe 2.0 x8 either. I knew it wasn't a big deal unless you were running multiple high-end cards, but I didn't think the gap was that small.

Ya, for current AMD cards at least. I read a rumour that TPU may have the 980 Ti's PCIe scaling numbers at some point for a point of reference. PCIe bandwidth bottlenecks are vastly overblown online. I guess if someone has a $1000 CPU and 3-4 Titan X SLI, you want 100% of the performance, not 95%, so for those gamers it matters. :D
 

TemjinGold

Diamond Member
Dec 16, 2006
3,050
65
91
OP: You never mentioned what games you play. Assuming you don't play the games that heavily favor nVidia, there's really no disadvantage to going AMD and taking advantage of Freesync.
 

synergy.inc

Junior Member
Nov 17, 2015
7
1
0
TemjinGold said:
OP: You never mentioned what games you play. Assuming you don't play the games that heavily favor nVidia, there's really no disadvantage to going AMD and taking advantage of Freesync.
Yeah, I was going to post that and it slipped my mind. I play pretty much everything from Civilization to Battlefield and all sorts in between. Finding time to game has become scarce lately, though, so I'm not hardcore like I used to be. Also, lately I've been sucked into a lower-spec game, AoE - Castle Siege, which runs on freakin' phones, so not a demanding title by any means. Witcher 2 is probably the game that gives my 560 Ti the biggest run for its money. I loved Fallout 3, so I will probably be picking up 4 and would like it to be spectacular! :D

Thanks for that info, Russian! Very helpful indeed!
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
Lately it appears having faster RAM is making a bigger difference on framerates. FYI, I have a 1440P monitor + MSI 390, and FreeSync is amazing.
 

synergy.inc

Junior Member
Nov 17, 2015
7
1
0
Madpacket said:
Lately it appears having faster RAM is making a bigger difference on framerates. FYI, I have a 1440P monitor + MSI 390, and FreeSync is amazing.
Another aging component of my system. I have 16 gigs of DDR3 1600. So it's not crappy, but not the quickest either.

I'm glad that FreeSync is a legit feature. I'll let you guys know how the Sapphire Nitro 390 works out. Guess I kind of put the cart before the horse, buying a monitor that my video card can't fully drive! D:
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
That's the beauty of Freesync. With a 390 you'll be able to crank most settings to max and the variable framerates take care of the rest. As long as you keep the minimums above 35 (or 40 in your case) it'll feel smooth as melted butter ;)
 
Last edited:

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
As for the RAM affecting framerates, I'm going to do some tests myself between 1600MHz CL9 and 2400MHz CL10. I have both a new 970 and a 390, so I can see how both cards react to different RAM speeds using a 4790K. I'll start a new thread, and it'll take some time to do the testing, but if anyone has specific game requests, let me know.
 

Zodiark1593

Platinum Member
Oct 21, 2012
2,230
4
81
synergy.inc said:
Yeah, I was going to post that and it slipped my mind. I play pretty much everything from Civilization to Battlefield and all sorts in between. Finding time to game has become scarce lately, though, so I'm not hardcore like I used to be. Also, lately I've been sucked into a lower-spec game, AoE - Castle Siege, which runs on freakin' phones, so not a demanding title by any means. Witcher 2 is probably the game that gives my 560 Ti the biggest run for its money. I loved Fallout 3, so I will probably be picking up 4 and would like it to be spectacular! :D

Thanks for that info, Russian! Very helpful indeed!

Aside from Ubersampling, my GTX 960 pretty well maxes Witcher 2 at 1080P for 60 FPS, no problems. A Radeon R9 290 may allow you to utilize the Ubersampling setting.
 

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
Madpacket said:
Lateley it appears having faster RAM is making a bigger difference on framerates.
Faster RAM only really makes sense if you're CPU-bound, i.e. you've got an i3 or lower. Then RAM speed makes quite a big impact. If you've got a Sandy Bridge i5 or above, RAM speed above 1600MHz makes little to no difference in gaming. Since the OP has a Sandy Bridge i7, it doesn't apply to him.

TechSpot said:
Right away I’ll say that in practical game testing – including testing with the integrated graphics – Skylake just doesn’t seem to benefit substantially from faster memory. This may change with DirectX 12, but modern games seem to be more capacity intensive than speed intensive. However, for any kind of multimedia work, memory speed becomes much more relevant.
Source
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Mondozei said:
Faster RAM only really makes sense if you're CPU-bound, i.e. you've got an i3 or lower. Then RAM speed makes quite a big impact. If you've got a Sandy Bridge i5 or above, RAM speed above 1600MHz makes little to no difference in gaming. Since the OP has a Sandy Bridge i7, it doesn't apply to him.

Source

It does for FO4. FO4 is extremely sensitive to memory speeds, even with non-Skylake CPUs. We even see an increase on Bulldozer. Per TechSpot, an i3 with DDR3-2400 blows the doors off an i7 4770K with DDR3-1333:

(chart: Fallout 4 FPS vs. memory speed)


But I wouldn't spend any more $ on faster DDR3 for the OP. Just get a new GPU and then do a full blown DDR4 platform update.
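For context on what "faster DDR3" actually buys in raw numbers, here's a quick peak-bandwidth sketch (the helper is mine, for illustration; these are theoretical maxima, not measured figures):

```python
# Peak theoretical DDR3 bandwidth: each channel has a 64-bit (8-byte) bus,
# and the rated speed (e.g. DDR3-1600) is in megatransfers per second.

def ddr3_bandwidth_gbps(mt_per_s, channels=2):
    """Peak bandwidth in GB/s for a dual-channel setup by default."""
    return mt_per_s * 1e6 * 8 * channels / 1e9

print(ddr3_bandwidth_gbps(1600))   # 25.6 GB/s
print(ddr3_bandwidth_gbps(2400))   # 38.4 GB/s
```

So DDR3-2400 offers about 50% more peak bandwidth than the OP's DDR3-1600, which lines up with the FO4 chart, but it's still not worth new DDR3 sticks when a full DDR4 platform refresh is on the horizon.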
 
Last edited:

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
That's why I want to investigate this RAM thing further... Fallout uses an old, modified engine, so those results are pretty crazy.

Just a theory but perhaps these newfound speed differences are due to the greater effective memory speeds of the Xbox One / PS4.

Since primary game development has shifted to them and they have higher overall memory speeds perhaps engine designers are taking advantage of this speed and changing the way game assets etc are handled in memory?

I dunno but it's worth investigating.
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
233
106
We need more professional benchmarks with single/dual/quad-channel memory, and DDR3 1600-1866 speeds, to see what kind of scaling we can get. Preferably in more than one game. This graph doesn't say much except two things: 1333 sucks and 2400 rocks in this game. Erm... go the extra mile and do it properly.

More samples please.
 
Last edited:
Feb 19, 2009
10,457
10
76
That is a 50% increase, how the hell did that happen?

My guess:

Probably because they designed it for consoles, with very high bandwidth on shared VRAM and system memory together as the same pool.

This means direct access to all game assets at high speeds (with caching for the Xbone's eDRAM).

On PC, with assets loaded into system RAM and streamed into VRAM, there's a bottleneck from the lower system RAM bandwidth.