"I'll probably run to Micro Center and pick something up, just don't want to shoot myself in the foot by getting something that won't work."

Used or new?
Used: I think a GTX 1070 should handle it easily.
New: probably any of them.
From AMD's specs: "Although DisplayPort 1.2 HBR2 can easily accommodate resolutions up to 4096x2160 @ 60Hz, the AMD Radeon™ HD 6800 series GPUs are designed to support up to 4096x2160 @ 50Hz."
The 6800 has 2x DVI, 1 DP, 1 HDMI. It was defaulting to 60Hz. I had a Radeon R7 240 in a computer hooked up to my TV and it would do 4K on the TV but only 2560x1080 with the new monitor... so, not sure what's going on. A friend is sending me a card he had in a mining rig that is a few years old, so I'm sure it will do all the higher resolutions, but I don't know why the R7 won't do 3440x1440.

Does the monitor use HDMI or DisplayPort? What is the minimum acceptable refresh rate?
Generally anything that does HDMI 2.0 or DisplayPort from the last... (decade?) and can set custom resolutions should work?
Considering only nVidia for a moment: the GTX 950, GT 1030, and newer generations should be capable of driving the UWQHD resolution at 60Hz for office work.
However, I would look further into what the HD 6800 can do. If it has DisplayPort 1.2, then it might be able to do UWQHD using a passive DP to HDMI 2.0 cable and the right driver? I did see a puzzling statement that might be relevant: the AMD spec note quoted above, which caps the HD 6800 at 4096x2160 @ 50Hz even though DP 1.2 HBR2 could carry 60Hz.
I'm too lazy to do the bandwidth math and I don't know the driver limitations, but the HD 6800 might not support over 30Hz refresh on its HDMI 1.4 port, or a crippled driver without custom resolution/refresh-rate settings might only offer 1080p when you ask for a higher refresh rate at 24bpp color or higher.
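For anyone who does want the bandwidth math, here's a rough sketch in Python. My blanking figures are approximations of CVT reduced blanking, not real vendor timing tables, so treat the output as ballpark:

```python
# Rough bandwidth sanity check: pixel clock = total pixels per frame
# (active + blanking) * refresh rate. Blanking values below are my own
# approximations of CVT reduced blanking, not official timing tables.

HDMI_1_4_MAX_MHZ = 340.0   # max TMDS clock for HDMI 1.4
DP_1_2_HBR2_GBPS = 17.28   # 4 lanes * 5.4 Gbps, minus 8b/10b overhead

def pixel_clock_mhz(h_active, v_active, hz):
    h_total = h_active + 160       # CVT-RB fixed horizontal blanking
    v_total = v_active * 1.03      # rough ~3% vertical blanking allowance
    return h_total * v_total * hz / 1e6

for name, mode in [("3440x1440@60", (3440, 1440, 60)),
                   ("3440x1440@30", (3440, 1440, 30)),
                   ("4096x2160@60", (4096, 2160, 60))]:
    clk = pixel_clock_mhz(*mode)
    dp_payload_gbps = clk * 1e6 * 24 / 1e9   # 24bpp payload over DP
    print(f"{name}: ~{clk:.0f} MHz pixel clock | "
          f"fits HDMI 1.4: {clk <= HDMI_1_4_MAX_MHZ} | "
          f"fits DP 1.2 HBR2: {dp_payload_gbps <= DP_1_2_HBR2_GBPS}")
```

Interestingly, 3440x1440 @ 60Hz with reduced blanking lands around 320 MHz, which squeaks under HDMI 1.4's 340 MHz ceiling on paper, while 4K60 only fits over DP 1.2. An older card or driver capping the pixel clock lower than 340 MHz would explain only seeing reduced modes.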
Anyway, the cheapest card I'd get for office work at this resolution would be a used nVidia GT 1030, around $25 on eBay. I'm only generalizing; random other, more powerful cards could cost no more used than some overpriced used GT 1030s... the video card market hasn't been right in a long time.
Also, with the older-generation cards you'll run out of driver vs. OS support at some point.
Did it just not show the 3440x1440 choice to select, or did you also try to set up a custom resolution and then get a blank screen or some error message?
I'm going to try using a new cable and see if that might be the problem...
I think you are right about it being a timing issue with these old cards. AMD's specs show that the 6800 will do 4K, but only at 50Hz and only over DP. This new monitor is rated up to 120Hz but may not support something as low as 50Hz. New card should be here Monday, so then we'll know...

For non-gaming use I'd still consider looking for a used GT 1030 on eBay, or a GTX 950, etc.: one of the x030 or x050 series cards newer than the 750 Ti, any with HDMI 2.0, for smaller size, lower power draw, less case heat, etc. Some of the 1030s are passively cooled too, IIRC.
Depending on the # of hours you'd run that mining card per year, the lower draw of a 1030 could pay for itself in power bill savings. Then again, the mining card can be underclocked and undervolted to achieve a significant drop, and possibly keep those fans lasting longer.
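As a rough sketch of that payback math, with made-up but plausible numbers (every figure below is an assumption; plug in your own wattages, hours, and electricity rate):

```python
# Hypothetical payback math for swapping a power-hungry mining card for a
# GT 1030. All numbers are assumptions; substitute your own.

mining_card_watts = 60     # assumed desktop-load draw of the older card
gt1030_watts = 10          # assumed desktop-load draw of a GT 1030
hours_per_year = 2000      # roughly 8 hours per weekday
usd_per_kwh = 0.15         # varies a lot by region

saved_kwh = (mining_card_watts - gt1030_watts) * hours_per_year / 1000
saved_usd = saved_kwh * usd_per_kwh
print(f"~{saved_kwh:.0f} kWh/year saved, ~${saved_usd:.2f}/year")
# With these numbers: ~100 kWh/year, ~$15/year, so a $25 used GT 1030
# would pay for itself in under two years.
```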
