GTX 980 Matrix SLI or GTX 980 Ti for a 4K monitor?

bleucharm28

Senior member
Sep 27, 2008
495
1
81
For this Asus RoG SWIFT PG27AQ 4K IPS G-SYNC monitor.


Do I need to go Ti, or will GTX 980 Matrix SLI be enough?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
980Ti typically has higher minimum framerates than 980 SLI.

Typically, it's the complete opposite. If SLI works, even a max-overclocked 980 Ti can't touch stock 980 SLI.

https://www.youtube.com/watch?v=EXesr7G3fxY

4K is so demanding that the 980 Ti barely makes a dent over the 980.
[Chart: Shadow of Mordor, GeForce cards: http://techreport.com/r.x/vram/mordor-geforce.gif]


980 SLI is 30% faster than Titan X:
[Chart: http://cdn.sweclockers.com/artikel/diagram/9477?key=1509ecaac8cd938be51dc93991f0ced7]


With a 1.5GHz overclock, 980 SLI will beat a 980 Ti @ 1.5GHz at 4K.

In reality, maxing out settings today isn't even possible with 980 Ti SLI at 4K, so no matter what decision is made, some settings will need to be turned down. Maybe it's just me, but if someone can afford 980 SLI and a 4K monitor ($1K?), at that point why not buy 980 Ti SLI?

This is a good review that gives a better idea of where 980 Ti SLI lands in 4K gaming today:
http://www.techspot.com/review/1033-gtx-980-ti-sli-r9-fury-x-crossfire/
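
To put rough numbers on that claim, here is a back-of-envelope sketch in Python; the per-card and scaling figures below are my own illustrative assumptions, not data from the video or review above:

# Rough 4K throughput, normalized to a stock GTX 980 = 1.0.
# All figures are assumptions for illustration, not benchmark data.
gtx_980       = 1.00                # baseline
gtx_980_ti    = 1.30                # assume ~30% faster than a 980 at 4K
oc_headroom   = 1.15                # assume ~15% from a max overclock
sli_scaling   = 0.80                # assume ~80% scaling where SLI works

gtx_980_sli   = gtx_980 * (1 + sli_scaling)   # ~1.80x
gtx_980_ti_oc = gtx_980_ti * oc_headroom      # ~1.50x

print(f"980 SLI (stock): {gtx_980_sli:.2f}x")
print(f"980 Ti (max OC): {gtx_980_ti_oc:.2f}x")
# Where scaling holds, stock 980 SLI clears even a heavily overclocked
# 980 Ti; where SLI is broken, it falls back to a single 980 (1.00x).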
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
He already has one GTX 980, RS.

Ya, I know. I was just thinking: he has a $500 card, probably a $1K monitor (can't seem to find the price for it; has it been released?), and he is willing to buy another $500 980. At that point, why not just go 980 Ti SLI and sell his existing 980? His total cost with 980 SLI is already a $1K monitor + $1K in GPUs. If he can afford to spend that much, maybe just go for the best?
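
As a rough sanity check on the upgrade math (the 980 Ti price and the 980 resale value below are guesses, not quotes):

# Hypothetical upgrade costs; every price here is an assumption.
second_980 = 500                  # buy a second GTX 980
ti_pair    = 2 * 650              # assumed street price per 980 Ti
resale_980 = 400                  # assumed resale of the existing 980

cost_980_sli   = second_980              # $500
cost_980ti_sli = ti_pair - resale_980    # $900 net of resale

print(f"980 SLI route:    ${cost_980_sli}")
print(f"980 Ti SLI route: ${cost_980ti_sli}")
# Roughly $400 extra for the faster pair, on top of ~$2K already committed.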

A 980 Ti at 1.5GHz is the most powerful card, and 4K is an uber-demanding resolution, so 980 Ti SLI + 4K seems like a very good fit.

http://www.maximumpc.com/nvidia-gtx-980-ti-2-way-sli-crushing-performance/

That 650W SeaSonic is a bit worrying, though, if he wants to extract maximum value from 980 Ti SLI + an overclocked i5. He could be getting very close to maximum power usage. I'm not saying the PSU won't handle it, since it's rated at 650W, but it's something to keep in mind.
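
For a quick power-budget estimate (the wattages below are typical figures I'm assuming, not measured draws from his system):

# Hypothetical system power budget; all wattages are assumptions.
gpu_power = 250    # reference 980 Ti board power; overclocks push higher
num_gpus  = 2
cpu_oc    = 100    # rough figure for an overclocked quad-core i5
rest      = 75     # motherboard, RAM, drives, fans

load = num_gpus * gpu_power + cpu_oc + rest
print(f"Estimated load: {load}W of 650W ({load / 650:.0%} of rating)")
# ~625W, about 96% of the PSU's rating, before any GPU overclock
# is factored in -- hence the concern about headroom.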

It's tricky though, because in some games at 4K, a 980 Ti OC really pulls away from a 980 OC.

Yes, these charts show 980 OC (and 980 OC SLI) and 980 Ti OC (and 980 Ti OC SLI):
[Charts: GTA V, Dragon Age, and The Witcher 3 at 4K]

http://www.toptengamer.com/best-4k-1440p-graphics-card/
 
Last edited:

bleucharm28

Senior member
Sep 27, 2008
495
1
81
Thanks for the great info, and thank you RussianSensation for the more in-depth detail.

I will be switching platforms to Socket 2011 for the SLI setup, with my existing 1200W Corsair PSU.

Or I may not go with 4K, since it's just stupidly expensive.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I would wait for the 16nm GPUs before jumping to 4K. No single card can handle 4K at the moment, and even with high-end GPUs like the Titan X in SLI, you might need to scale back settings to get good performance.

The 16nm GPUs will be much better equipped to handle 4K than current 28nm GPUs.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
Typically, it's the complete opposite. If SLI works, even a max-overclocked 980 Ti can't touch stock 980 SLI.

https://www.youtube.com/watch?v=EXesr7G3fxY

4K is so demanding that the 980 Ti barely makes a dent over the 980.
[Chart: Shadow of Mordor, GeForce cards: http://techreport.com/r.x/vram/mordor-geforce.gif]

980 SLI is 30% faster than Titan X:
[Chart: http://cdn.sweclockers.com/artikel/diagram/9477?key=1509ecaac8cd938be51dc93991f0ced7]

With a 1.5GHz overclock, 980 SLI will beat a 980 Ti @ 1.5GHz at 4K.

In reality, maxing out settings today isn't even possible with 980 Ti SLI at 4K, so no matter what decision is made, some settings will need to be turned down. Maybe it's just me, but if someone can afford 980 SLI and a 4K monitor ($1K?), at that point why not buy 980 Ti SLI?

This is a good review that gives a better idea of where 980 Ti SLI lands in 4K gaming today:
http://www.techspot.com/review/1033-gtx-980-ti-sli-r9-fury-x-crossfire/

He said "minimum"
 
Last edited:

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I'm still of the mindset that 4K performance isn't where it needs to be yet, unless you have 3 or 4 cards anyway. That's because I personally am not willing to drop settings to medium or lower to get the minimum framerates up. I'd really like to say otherwise, and maybe next year I can.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I'm still of the mindset that 4K performance isn't where it needs to be yet, unless you have 3 or 4 cards anyway. That's because I personally am not willing to drop settings to medium or lower to get the minimum framerates up. I'd really like to say otherwise, and maybe next year I can.

This really depends on what you play. There are only a few games that would require you to drop to medium, and if you don't play those, you are at high settings or better. And the games that do drop you to medium still look good at medium.

4K will never be where it needs to be if you go in with the mindset that you have to play at near-max settings in all games, because as long as 1080p is the primary target of devs, some devs will put in super-demanding settings that 4K will have to reduce.

If devs can choose to stop offering high-end settings so that 4K can play at max settings, so can you: just drop a few settings.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
This really depends on what you play. There are only a few games that would require you to drop to medium, and if you don't play those, you are at high settings or better. And the games that do drop you to medium still look good at medium.

4K will never be where it needs to be if you go in with the mindset that you have to play at near-max settings in all games, because as long as 1080p is the primary target of devs, some devs will put in super-demanding settings that 4K will have to reduce.

If devs can choose to stop offering high-end settings so that 4K can play at max settings, so can you: just drop a few settings.

Or he could just use his current setup and be happy, instead of switching over to 4K and turning down settings, which he ALREADY STATED would not make him happy.

I dunno why people recommend doing the exact opposite of what someone wants to do.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Or he could just use his current setup and be happy, instead of switching over to 4K and turning down settings, which he ALREADY STATED would not make him happy.

I dunno why people recommend doing the exact opposite of what someone wants to do.

That's perfectly fine. I am not ready to switch either. He thinks it will be ready in a year or two, which leads me to believe he wants to use 4K; he just isn't ready to drop settings. The reality is, he will have to drop settings when he switches to 4K unless he waits for devs to target 4K, which isn't going to happen until at least the next gen of consoles, if not much further out.

That means, if he wants 4K, he should just go 4K and adjust settings.

I also find it annoying when people act like 4K can only be played at medium, because there are like 5 games total that would require that.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
This really depends on what you play. There are only a few games that would require you to drop to medium, and if you don't play those, you are at high settings or better. And the games that do drop you to medium still look good at medium.

4K will never be where it needs to be if you go in with the mindset that you have to play at near-max settings in all games, because as long as 1080p is the primary target of devs, some devs will put in super-demanding settings that 4K will have to reduce.

If devs can choose to stop offering high-end settings so that 4K can play at max settings, so can you: just drop a few settings.

I want all the settings topped out; it's that simple. The only exception is something like what I did in The Witcher 3, where I turned off chromatic aberration and vignette, which I feel detract from the clarity. There's no setup at the moment, outside of 3- or 4-card configurations, that can achieve minimum frame rates where I want them at 4K. That's not hard to understand.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
That's perfectly fine. I am not ready to switch either. He thinks it will be ready in a year or two, which leads me to believe he wants to use 4K; he just isn't ready to drop settings. The reality is, he will have to drop settings when he switches to 4K unless he waits for devs to target 4K, which isn't going to happen until at least the next gen of consoles, if not much further out.

That means, if he wants 4K, he should just go 4K and adjust settings.

I also find it annoying when people act like 4K can only be played at medium, because there are like 5 games total that would require that.

No, you won't have to drop settings with the right hardware. You still don't get it. If I built a system with 4x 980s, I could run 4K right now the way I want to and be happy with the performance. What we need is for that performance to come from 1 or 2 cards, not 4.

See here.
[Charts: Crysis 3 and Metro: Last Light at 3840x2160]
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
But you don't have to play at Very High, and you will still have great-quality visuals. Crysis 3 looks great at medium, as does Metro.

It doesn't matter though; you've made your choice. I have too, as I don't want to go 4K until it supports 85Hz or better. My requirement is just not supported; yours is more of a mental block.

You'll be waiting many years before a single card will do what you wish, as the bar is always being raised by the devs. If the devs all of a sudden made it so a single card could handle 4K, they would have done it by just hiding those higher-level settings. If they did that, you'd think it was great, but at the same time, you can't hide them from yourself and think it is acceptable.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
But you don't have to play at Very High, and you will still have great-quality visuals. Crysis 3 looks great at medium, as does Metro.

It doesn't matter though; you've made your choice. I have too, as I don't want to go 4K until it supports 85Hz or better. My requirement is just not supported; yours is more of a mental block.

You'll be waiting many years before a single card will do what you wish, as the bar is always being raised by the devs. If the devs all of a sudden made it so a single card could handle 4K, they would have done it by just hiding those higher-level settings. If they did that, you'd think it was great, but at the same time, you can't hide them from yourself and think it is acceptable.

You are fundamentally missing the point if you keep saying this to him after reading his posts. Giving someone advice that fundamentally breaks one of their goals just makes no sense...

I completely agree with him, so when I go 4K gaming, it's either going to mean dropping a LOT of money to make it happen, or playing older games (what I do now anyway) so I can play with all settings turned up at 4K.

You don't understand where he's coming from when he says he doesn't want to turn ANY setting down (unless it's one whose IQ he doesn't actually like) when gaming at ANY resolution. I agree with him; I won't step down from Very High to Medium in Crysis 3 at 4K.

On the first bolded point, you can't be serious? Are you insulting his intelligence in a completely unfounded manner? Hardware will get faster... Soon 1080p will be the new 720p, 4K will be the new mainstream, hardware will be fast enough to run 4K with all the eye candy, and then we'll be worrying about why 8K isn't possible.

Just wow at the second bolded point; it's hard to even begin to describe how unwarranted that statement is, especially when cmdrdredd recognizes that the hardware to do what he wants takes 4 cards now but will take 2 cards in the future.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
But you don't have to play at Very High, and you will still have great-quality visuals. Crysis 3 looks great at medium, as does Metro.

It doesn't matter though; you've made your choice. I have too, as I don't want to go 4K until it supports 85Hz or better. My requirement is just not supported; yours is more of a mental block.

You'll be waiting many years before a single card will do what you wish, as the bar is always being raised by the devs. If the devs all of a sudden made it so a single card could handle 4K, they would have done it by just hiding those higher-level settings. If they did that, you'd think it was great, but at the same time, you can't hide them from yourself and think it is acceptable.

Very High and Ultra at 1080p look better than reducing them to medium and running 4K. This was discussed in another thread in the PC Gaming forum; people there said the same thing you did, and I disagreed, as did many others. When the lighting quality, draw distance, shadow detail, object density, and other effects are turned down, the graphics just don't have the same quality to them. Right now it's a compromise (spend a lot of cash or turn stuff down). When the hardware catches up, we won't have to make that decision anymore. It will just take time.

Not only that, but 4K adoption is pitifully slow because there's not much native content, and I don't consider streaming 4K proper content, as it's compressed pretty heavily and barely looks better than 1080p even at its highest quality. Screen prices will come down, and hardware will get faster to handle the load in games. It's a matter of time, unless some other technology arrives before that happens.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
He said "minimum"

What about it? The very first thing in my reply addresses his point.

The video shows minimums and averages for a 980 Ti OC vs. stock 980 SLI. A max-overclocked 980 Ti typically has lower minimums than stock 980 SLI, which is the complete opposite of what he mentioned.

For the second time:
https://www.youtube.com/watch?v=EXesr7G3fxY

I wish people would actually spend the time to read the reply; all the information is in there.

[Charts: GameGPU GTX Titan X review benchmarks (ryse, lot, bh, gta_v, dl, da)]

I am not seeing how the 980 Ti "typically has higher minimum framerates" than 980 SLI at 4K in games with good SLI scaling.

But you don't have to play at Very High, and you will still have great-quality visuals. Crysis 3 looks great at medium, as does Metro.

Having both 1080p and 1440p monitors, VH/Ultra at 1080p looks better to me than Medium at 1440p. That's because, to me, graphical settings always matter more than pixels. But let's just go with your idea that medium looks great.

OK, then by definition, if you are fine with playing at Medium settings, what happens when more demanding games come out? At 1440p VH/Ultra, you can still lower settings to Medium (you said yourself this is acceptable). But if you are already playing 4K at Medium, there is no room left to lower settings, because beyond that point the graphics deteriorate dramatically. That means you'd be chasing pixels at the expense of actual graphics and performance.

Another thing: a 27-28" screen isn't really a good size for a 4K monitor if you use it for anything else. Such high DPI on such a small screen is going to be very uncomfortable for most people for browsing the Internet or working.

What about 34" version of Acer's G-Sync monitor?

1. Supposedly you get the benefits of a 100Hz refresh rate.
2. You get G-Sync, which means a smoother gaming experience at low FPS.
3. You get a monitor that's superior for productivity and gives a better FOV in FPS and strategy games.

4K is likely awesome on a 37-40" PC monitor, but I don't see it as much of a slam dunk at 27-28". Just my opinion, as I also much prefer larger monitors for work, etc.

Very High and Ultra at 1080p look better than reducing them to medium and running 4K. This was discussed in another thread in the PC Gaming forum; people there said the same thing you did, and I disagreed, as did many others. When the lighting quality, draw distance, shadow detail, object density, and other effects are turned down, the graphics just don't have the same quality to them.

Exactly. Take Crysis 1/3 and run it maxed out at 1280x1024 on a CRT (or in windowed mode, so you don't get poor LCD scaling) and it will look better than the same games running at medium on a 1080p monitor.

It is true that, short of using a magnifying glass, some games look almost identical going from Ultra/VHQ to HQ, but most games look much worse once you go down to medium settings. Resolution alone doesn't fix inferior graphics. It can reduce the need for anti-aliasing and make textures look sharper, but it won't suddenly make low graphical details look amazing. I think a lot of gamers just want to justify getting a 4K monitor, while others don't want to buy a high-quality stop-gap 1440p monitor when they're thinking, "Well, I might as well spend $200-300 more and just get 4K."

I also think some games, like GTA V, look much better at 4K, while others don't show the same improvement. I guess the best advice is for the OP to find a store where he can check out 4K vs. 1440p or a 3440x1440 widescreen monitor in person for games, productivity, etc., to make a more informed decision.
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Very High and Ultra at 1080p look better than reducing them to medium and running 4K. This was discussed in another thread in the PC Gaming forum; people there said the same thing you did, and I disagreed, as did many others. When the lighting quality, draw distance, shadow detail, object density, and other effects are turned down, the graphics just don't have the same quality to them. Right now it's a compromise (spend a lot of cash or turn stuff down). When the hardware catches up, we won't have to make that decision anymore. It will just take time.

Not only that, but 4K adoption is pitifully slow because there's not much native content, and I don't consider streaming 4K proper content, as it's compressed pretty heavily and barely looks better than 1080p even at its highest quality. Screen prices will come down, and hardware will get faster to handle the load in games. It's a matter of time, unless some other technology arrives before that happens.

Well then, if you believe that is the case, don't expect to upgrade to 4K for many years, as devs will continue to put in settings that push 1080p, which will always cause 4K to require more GPU power than a single card can deliver.

Edit: I'd also like to see this post about IQ you are talking of, as the ones I've seen didn't say what you just said. The one I recall said that, on average, medium at 4K was lower, but only if you include all the games that don't need to be played at medium. How often do you play games that require settings that low? I know we like to look at benchmarks of these games, but do we all spend that much time playing those specific, highly demanding games?
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
What about it? The very first thing in my reply addresses his point.

The video shows minimums and averages for a 980 Ti OC vs. stock 980 SLI. A max-overclocked 980 Ti typically has lower minimums than stock 980 SLI, which is the complete opposite of what he mentioned.

For the second time:
https://www.youtube.com/watch?v=EXesr7G3fxY

I wish people would actually spend the time to read the reply; all the information is in there.

The biggest issue is that you posted pages of benchmarks, none of which showed minimums. If you expected someone to read through all those pages and watch a video in the mix, you definitely overestimate how much time people are willing to spend on a single post.

The absolute minimums are almost always lower in SLI/CF in a CPU-bound setup, because when both a single- and a multi-GPU setup are CPU-limited, the multi-GPU setup has more overhead. The multi-GPU setup has higher minimums when there is no CPU-bound situation. That isn't to say a single-GPU setup doesn't spend more time at low FPS, only that the absolute minimum will generally come from the multi-GPU setup when CPU-bound.

A person who plays at higher FPS and doesn't max out settings will find they have a lower minimum with a multi-GPU setup; a person going for the highest IQ who doesn't mind fairly low FPS will find the opposite. 4K benchmarks obviously lend themselves to GPU-bound setups, making minimums favor multi-GPU. A person playing at 4K may or may not see the same results, depending on his settings.

Note: I'm not saying a single card would be best, only that in many cases he is right about minimums. You just don't often see it in 4K benchmarks.
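
To illustrate the CPU-bound vs. GPU-bound point, here is a toy frame-time model in Python (the millisecond figures and the flat per-frame SLI overhead are invented for illustration):

# Toy model: each frame is limited by the slower of CPU and GPU work.
# All timings are made-up illustrative numbers, not measurements.
def fps(cpu_ms, gpu_ms, gpus=1, sli_overhead_ms=0.0):
    # With AFR SLI, GPU work splits across cards, but every frame
    # still pays the full CPU cost plus some multi-GPU overhead.
    frame_ms = max(cpu_ms + sli_overhead_ms * (gpus > 1), gpu_ms / gpus)
    return 1000.0 / frame_ms

# GPU-bound worst-case frame (4K, heavy scene): SLI lifts the minimum.
print(fps(cpu_ms=10, gpu_ms=40))                             # 25 FPS
print(fps(cpu_ms=10, gpu_ms=40, gpus=2, sli_overhead_ms=2))  # 50 FPS

# CPU-bound worst-case frame (lower settings, busy scene): the
# overhead makes the SLI minimum lower than the single GPU's.
print(fps(cpu_ms=20, gpu_ms=15))                             # 50 FPS
print(fps(cpu_ms=20, gpu_ms=15, gpus=2, sli_overhead_ms=2))  # ~45 FPS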
 
Last edited:

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Well then, if you believe that is the case, don't expect to upgrade to 4K for many years, as devs will continue to put in settings that push 1080p, which will always cause 4K to require more GPU power than a single card can deliver.

Edit: I'd also like to see this post about IQ you are talking of, as the ones I've seen didn't say what you just said. The one I recall said that, on average, medium at 4K was lower, but only if you include all the games that don't need to be played at medium. How often do you play games that require settings that low? I know we like to look at benchmarks of these games, but do we all spend that much time playing those specific, highly demanding games?

Hardware will catch up to where you can play games at 4K resolutions. Look how long it took before you could play games at 1080p; now that's considered standard, if not the minimum, for most gamers, and a single $250 card can handle most games just fine that way. I'm not demanding 60fps minimums at high resolution, but averages at or below 30fps sometimes mean you are getting drops into the teens, and that's not acceptable to me.
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Hardware will catch up to where you can play games at 4K resolutions. Look how long it took before you could play games at 1080p; now that's considered standard, if not the minimum, for most.

It only became standard because devs targeted that resolution. When do you think devs will target 4K?

1080p is only able to handle "Ultra" because devs don't put in higher-end graphics. They could target 640x480 again, and 1080p would have to play at medium.

It's more likely you'll decide higher-end graphics aren't worth the lower resolution before a single card can handle Ultra at 4K in all games.

And it's OK that you don't want 4K; I'm mostly just trying to open your eyes to the reality of the gaming market. 4K at Ultra is not possible in the most demanding games, not because of hardware, but because of the resolution devs target with their ultra settings. At some point, you'll have to decide when 4K is worth lower settings.

I think this concept was a LOT easier to grasp when CRTs were the norm, as we could easily trade resolution against settings on the same hardware. Back then, people were very fluid with their graphical settings and resolutions. Now people are a lot more fixated on settings and have forgotten what resolution brings.
 
Last edited: