
Getting a 780 Ti and 4K monitor

You should know about MST (Multi-Stream Transport) and the possible headaches that come with it. Also, be prepared for intermittent DisplayPort issues like failing to wake from sleep and problems on cold boots. A single 780 Ti is only going to net you medium settings in any kind of demanding game.
 
According to benchmarks I could play at max settings with only 35-50 fps. Getting the Samsung 590D and a PNY 780 Ti.

You honestly should just get a 27" 1440p monitor for gaming. Those cheap 4K monitors don't have great picture quality, and the resolution is too high for the size. Furthermore, 35-50 fps is not great. I've used my 780 Ti on my 4K television, and it struggles in a lot of games. There's no reason to choke the 780 Ti on the desktop when you can have a better 1440p screen for less.
 
No single card can drive the majority of existing games at 4K; maybe the hypothetical GTX Titan II could, but for now SLI is the minimum for decent framerates. I wouldn't drive 4K with a 780 Ti unless the only games you plan to run are 4X, MMO, or RTS games.
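The arithmetic behind "SLI is the minimum" is just pixel counts. A quick sketch (a rough illustration of relative GPU load, not a benchmark):

```python
# Raw pixel counts per frame at common resolutions. 4K pushes 2.25x the
# pixels of 1440p, which is roughly why a card that holds 60+ fps at
# 1440p often lands in the 30-45 fps range at 4K in GPU-bound games.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def pixel_count(name):
    w, h = RESOLUTIONS[name]
    return w * h

for name in RESOLUTIONS:
    print(f"{name}: {pixel_count(name):,} pixels")

ratio = pixel_count("4K") / pixel_count("1440p")
print(f"4K / 1440p pixel ratio: {ratio:.2f}")  # 2.25
```

Scaling is never perfectly linear with pixel count, but it is a decent first-order estimate for GPU-bound titles.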
 
You honestly should just get a 27" 1440p monitor for gaming. Those cheap 4K monitors don't have great picture quality, and the resolution is too high for the size. Furthermore, 35-50 fps is not great. I've used my 780 Ti on my 4K television, and it struggles in a lot of games. There's no reason to choke the 780 Ti on the desktop when you can have a better 1440p screen for less.

I really agree with this; it's just too small for the resolution. 1440p at 27 inches is good, and 1600p at 30 is good, but having seen a 28-inch 4K, it's not ideal, especially since Windows 8.1 is still struggling with DPI scaling for a number of reasons. The end result is blurry or weird-looking text and images in apps that don't handle DPI scaling natively, which is the vast majority of them. Text in games will also be small at times.

A better bet is to wait for a 32-34 inch 4K screen, or just get a 1440p or 1600p monitor. I believe Asus was planning to release a 32-inch 4K panel, as were a few other manufacturers. I really think 4K is better suited to screen sizes above 30 inches.
 
I'd rather have a 28-inch 4K display than a 32-inch one. Higher pixel density. And I am getting another 780 Ti in a few months.
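The pixel-density difference being argued here is easy to put numbers on. A quick sketch using the standard PPI formula (diagonal pixel count over diagonal size, nothing vendor-specific):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The three sizes under discussion:
print(round(ppi(3840, 2160, 28)))  # 28" 4K    -> ~157 PPI
print(round(ppi(3840, 2160, 32)))  # 32" 4K    -> ~138 PPI
print(round(ppi(2560, 1440, 27)))  # 27" 1440p -> ~109 PPI
```

So a 28-inch 4K panel is roughly 45% denser than a 27-inch 1440p one, which is exactly why the DPI-scaling question below matters so much.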
 
High PPI is fine with proper DPI scaling.

First of all, it's your choice and you should do whatever you want to do. But that's the crux of the problem right there: DPI scaling on high-PPI displays.

Windows 7/8 is not like iOS, Android, or OS X, which can properly scale high-PPI images and text; it's a mixed bag in Windows 8.1. Windows won't scale it ideally: by the time Windows 8.1 does the DPI scaling, the vast majority of apps will have blurry images, and at that point what's the point of high PPI? Nothing. So you either get text so small that you squint, or you get garbage DPI scaling that makes the text blurry. By the way, AnandTech had a writeup a few months back on the Windows DPI situation.

In the end, though, it's up to you. IMO it's wasted money, but it's your call and your money, so whatever floats your boat. I'm stubborn too; I have to figure these things out from firsthand experience. 😉 Heck, even 1440p on a 27-inch is kind of squinty, with borderline text legibility at times.
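To make the scaling trade-off concrete: DPI scaling effectively shrinks the usable workspace. A rough sketch of the "looks like" resolution at common Windows scale factors (an illustration only; real per-app behavior varies, as the post above notes):

```python
# Effective ("looks like") workspace of a 4K desktop at common Windows
# scale factors. At 150%, UI elements are the same apparent size as on
# a native 2560x1440 desktop -- the crux of the argument above: either
# you scale (and blur non-DPI-aware apps) or you squint.
def effective_resolution(width_px, height_px, scale_percent):
    factor = scale_percent / 100
    return round(width_px / factor), round(height_px / factor)

for scale in (100, 125, 150, 200):
    w, h = effective_resolution(3840, 2160, scale)
    print(f"{scale}%: looks like {w}x{h}")
```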
 
Also, I am not using the monitor for browsing; I have my other monitors for that. The 4K is going to be used strictly for gaming and movies.
 
Also, I am not using the monitor for browsing; I have my other monitors for that. The 4K is going to be used strictly for gaming and movies.

For movies? There is very little 4K content out there. Like others have said, performance won't be that great in games with a single 780 Ti at high/ultra settings.

Just so you can take my words with a little confidence: I have/had a Dell 4K screen and two 780 Tis. I have since moved on.
 
You don't "need" two cards if you are not picky about having the best settings or staying above 60 fps. I am too picky, so 4K is not even a consideration for me at this point. Plus, the horrible Windows scaling would kill it for me for everyday use.
 
You don't "need" two cards if you are not picky about having the best settings or staying above 60 fps. I am too picky, so 4K is not even a consideration for me at this point. Plus, the horrible Windows scaling would kill it for me for everyday use.

But max settings with AA at 1440p/1600p will look better than medium settings at 4K, so what's the point? 4K gaming only makes sense right now with dual flagship cards like 290s or 780s.
 
But max settings with AA at 1440p/1600p will look better than medium settings at 4K, so what's the point? 4K gaming only makes sense right now with dual flagship cards like 290s or 780s.
If you looked at ALL the games out there, there are actually only a few that you would need to run on medium. Even most of the demanding games will be OK on high if 40-45 fps is acceptable.

And I am not saying to get 4K; in fact, I said I personally would not do it. Some people don't care about 60 fps or the best settings, though. Heck, in Metro: Last Light you can't even see the difference between high and very high, yet very high runs at about 70 fps while high runs at about 120 fps at 1080p with a plain 780. Really, even medium looks just fine in that game.
 
You need to take into account future games released within the next few years. 4K on a single card isn't going to be feasible until Pascal or its AMD equivalent.
 
Have you considered waiting for the 800 series? This is a bad time to buy high-end GPUs.

I'm no longer convinced of that.
I don't see the 880 competing with the Ti.
For the first time ever, I have seen several deals for a 780 Ti south of $600.
If you want the Ti, I am convinced now is possibly the best time.
 
Most 4K monitors I've seen are limited to 30 Hz, and the picture quality is crap. 1600p might be a better investment.

The 780 is fine for 30 fps at 4K, but if you want 60 fps look more toward the R9 290.

Either way, as others have pointed out, it's your money.
 
If you looked at ALL the games out there, there are actually only a few that you would need to run on medium. Even most of the demanding games will be OK on high if 40-45 fps is acceptable.

And I am not saying to get 4K; in fact, I said I personally would not do it. Some people don't care about 60 fps or the best settings, though. Heck, in Metro: Last Light you can't even see the difference between high and very high, yet very high runs at about 70 fps while high runs at about 120 fps at 1080p with a plain 780. Really, even medium looks just fine in that game.

Agreed. I will re-state that 28 inches is way too small for 4K, but if you're doing 4K, nothing is forcing you to use max settings. The fact of the matter is that if you're using presets, 99% of games look identical on high as they do on ultra. I have played games at 1600p trying to spot differences, and my 20/20 vision just can't find any. I'm sure they're there if you have a microscope, but you won't ever notice them. Use a lower preset, and use 2x MSAA or FXAA instead of the ultra preset with 8x MSAA or SSAA. Boom, you've probably gained 40-50 fps with no visible difference in image quality.

Furthermore, most ultra presets are overkill and apply silly settings such as SSAA. SSAA kills performance by 60-80% depending on the quality and is never worth it. But that's the ultra preset in The Witcher 2 and, I believe, Tomb Raider. There are other unnecessary settings included in ultra presets too, but SSAA sticks out in my mind as the worst offender.

But again, it's a situation of doing whatever the heck you want with your money. If someone wants three GPUs for ultra at 4K, they should knock themselves out. But you do not need three GPUs if you use your head and pick sensible settings.
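The SSAA cost is visible straight from the pixel math: supersampling renders the scene at a higher internal resolution and downsamples, so shading work grows with the sample grid. A back-of-the-envelope sketch (not engine-specific):

```python
# SSAA renders internally at (factor x factor) times the output pixel
# count, so 2x2 SSAA at 4K shades ~33 million pixels per frame -- 4x
# the work, which lines up with the 60-80% performance hit mentioned.
def ssaa_pixels(width_px, height_px, factor=2):
    return (width_px * factor) * (height_px * factor)

base = 3840 * 2160
ssaa = ssaa_pixels(3840, 2160, 2)
print(f"4K native:     {base:,} pixels shaded")
print(f"4K + 2x2 SSAA: {ssaa:,} pixels shaded ({ssaa / base:.0f}x the work)")
```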
 
Actually, SSAA is not standard in Tomb Raider's ultra preset or even the ultimate preset. Even on ultimate, the shadows are normal rather than ultra, which is odd. TressFX, though, is standard on ultimate, which seems odd as it should be a completely separate setting IMO.
 
Furthermore, most ultra presets are overkill and apply silly settings such as SSAA. SSAA kills performance by 60-80% depending on the quality and is never worth it. But that's the ultra preset in The Witcher 2 and, I believe, Tomb Raider. There are other unnecessary settings included in ultra presets too, but SSAA sticks out in my mind as the worst offender.

I don't want to go off topic, but have a look between Lara's right arm and her body during the Tomb Raider benchmark. It looks messed up with FXAA but much nicer with 2x SSAA.

Also, I think the Ubersampling in The Witcher 2 does a bit more than just AA. Normal anisotropic filtering makes things like the puddles float above the ground, so it must have its own technique for that. I've also noticed a big difference in some of the lighting.
 