
Pairing OCed i7-920 with new GPU for 4K

Right, so this is what happened: I picked up a 4K ASUS PB287Q and quickly realized that my i7-3610QM + 7970M DTR is not going to cut it (for one, its HDMI 1.0 and DP 1.1 ports can't carry 4K@60, even though League of Legends was running smoothly at 30 fps with everything turned up. Who knew ports would be my Achilles' heel).
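The port limitation comes down to simple arithmetic. Here's a rough sketch (the link capacities are the standard effective rates after 8b/10b encoding, and blanking overhead is ignored, which only flatters the older links):

```python
# Back-of-the-envelope check: why older display links can't carry 4K@60.
# Blanking intervals are ignored; real requirements are slightly higher.

def payload_gbps(width, height, hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s, ignoring blanking."""
    return width * height * hz * bits_per_pixel / 1e9

needed = payload_gbps(3840, 2160, 60)  # ~11.94 Gbit/s of pixel data

links = {
    "HDMI 1.4 (340 MHz TMDS)": 8.16,   # 10.2 Gbit/s raw, 8b/10b coded
    "DP 1.1 (4x HBR)":         8.64,   # 10.8 Gbit/s raw, 8b/10b coded
    "DP 1.2 (4x HBR2)":        17.28,  # 21.6 Gbit/s raw, 8b/10b coded
    "HDMI 2.0 (600 MHz TMDS)": 14.40,  # 18.0 Gbit/s raw, 8b/10b coded
}

for name, capacity in links.items():
    verdict = "OK" if capacity >= needed else "not enough"
    print(f"{name}: {capacity:.2f} vs {needed:.2f} Gbit/s needed -> {verdict}")
```

So even HDMI 1.4 falls short of 4K@60; you need DP 1.2 or HDMI 2.0, which is exactly what the older mobile outputs lack.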

So, I have this old rig that I didn't keep up to date due to RL, but it's now back in use as follows:

Intel Core i7-920 D0 @ 4 GHz (may head into used Xeon E5620 Westmeres for cheap if they come up) + Thermalright Ultra-120 Extreme in a push/pull config

EVGA X58 ATX LGA1366

OCZ Gold 12 GB DDR3-1600 (sad that OCZ died as a company, but...)

Corsair HX1000

HAF 932 full tower

5870

I want this rig to last at least until gen 2 of the new DX12 cards is out (with 8 GB of VRAM baked in), Intel releases the replacement for LGA1366/2011-3 or merges the two lines to end the madness, DDR4 is mature and not at launch pricing, and Windows 10 has matured a bit. Then I'll do a full rebuild reusing the SSDs/HDDs, the PSU, and the case, and may go water cooling this time around.

I want to be able to drive 4K in CURRENT (not future) games at high-ish settings (maybe lowering shadows to medium or something like that), with no or little AA. I play Diablo 3, WoW, League, Titanfall, CS:GO, Cities: Skylines, EVE Online, and random others. I know Civ at 4K may be an issue, as may any other CPU-heavy game, but normal GPU-bound stuff will hopefully fly.

I see people pairing their OCed 920s with 970s, which works wonders at 1080p or 2K, but given the whole 3.5+0.5 GB issue, the 970 will likely run into trouble at 4K. Do people do this because the 970 is maxing out EVERYTHING the 920 can offer, or is it more that people are not trying to run 4K?
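The 3.5+0.5 concern is mostly about how fast render-target memory grows with resolution. A quick illustrative sketch (the buffer list below is an assumed deferred-rendering setup for illustration, not any particular game's real allocation):

```python
# Rough illustration of why the 970's 3.5 GB fast partition matters
# more at 4K: full-screen buffers scale with pixel count, so the same
# render-target setup quadruples from 1080p to 2160p.
# NOTE: the buffer layout below is a hypothetical example.

def render_target_mb(width, height, buffers):
    """Total size in MiB of full-screen buffers, each (bytes_per_pixel, count)."""
    pixels = width * height
    return sum(bpp * n * pixels for bpp, n in buffers) / 2**20

# Assumed: 4 G-buffer layers at 8 B/px, depth at 4 B/px,
# 2 HDR post buffers at 8 B/px, double-buffered swapchain at 4 B/px.
buffers = [(8, 4), (4, 1), (8, 2), (4, 2)]

mb_1080 = render_target_mb(1920, 1080, buffers)
mb_2160 = render_target_mb(3840, 2160, buffers)
print(f"1080p render targets: ~{mb_1080:.0f} MiB")
print(f"4K render targets:    ~{mb_2160:.0f} MiB (exactly 4x)")
```

Add textures and geometry on top and a 4K workload pushes past the 970's 3.5 GB fast segment far sooner than a 1080p one does, which is why the card benchmarks fine at 1080p but gets riskier at 4K.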

I am eyeing the new 300 line from AMD, the 390X, but I wonder if the 920 can actually drive such a beast. The Titan X would be on the table if it weren't another $1k card meant as a cheap Quadro with full DP math enabled rather than a max-end gaming card.

I could CF/SLI, but I think a single card would be less of a headache and more resellable when I do my full gut and refresh.

Does anyone have experience with this pairing? If you have an i7-920 @ ~4 GHz + GTX 970/980/290X, can you report back on whether the GPU is being throttled? If so, by how much?

Thanks.
 
Your processor is enough. Note that higher resolutions depend mostly on GPU horsepower, so I suggest you go for the R9 290X.
 
For 4K I would wait for the 390 if you want a single card, or I would get 2x R9 290 (2x $250).

Civ: Beyond Earth would be OK even with a single R9 290. You can play at High settings instead of Ultra.

 
I think you'll be fine. I ran my 290 with my X5650 for a year before I bought a new X99 system, and in gaming I can't tell the difference between the two outside of benchmarks. X58 is still a beast, especially with the 6-core Xeons.
 
+1 vote for getting a 6-core 32 nm Xeon for peanuts to put in there; they overclock quite well, too. Then again, a 4 GHz D0 is no slouch, it would probably be fine as-is.

At 4K the 290 and 290X close the gap and take the lead, respectively, over the 970. Crossfired 290s are very good value, easily the best value you can get in high-end graphics. If you want to stay single-GPU, wait until the GM200 980 Ti/1080/whatever they call it and the 390(X) are out, which is supposed to be very soon. 2x 290 = $540, significantly faster than a single 980 for the same price.

Civ: Beyond Earth has a special Mantle-only split-frame-rendering Crossfire mode that significantly decreases frame times (i.e. it's perceived as a very smooth framerate) and works with the 290(X) and 295X2.
 
Hmm, I will wait for the 390X then; that was the plan anyway, since I am this close to their launch.

The Titan X seems to be another DP monster for compute and not so much a gaming monster, so I will likely pass on it.

So you guys think I could even CF/SLI these? I mean, would the 920 feed them properly?

Thanks for all the answers. I want to ride out socket 2011 and into the new high-end socket, or wait for Intel to unify it all again.
 
The Titan X is actually gutted for DP in favor of gaming, but how it fares in gaming against the 390X is up for debate. It's likely best to wait, both for pricing reasons and to make sure you get the right one, even if you end up going with a Titan X.
 

Not sure where you got that info. The Titan X is 100% gaming, no compute...
 
I used a 4 GHz 920 with a 980 for about two months at 1920x1080, then moved to a 4790K. I didn't notice much difference in FPS games in general. There were some specific maps or areas in a few games like Crysis that were noticeably faster, but overall it wasn't a big deal, and the difference will be even smaller at 4K. Some compute work I do is much faster now, though, and multiplayer and strategy games will benefit more as well.
 
Yeah, the Titan X doesn't have the DP throughput. It does still have 12 GB of VRAM if you do CUDA stuff that needs lots of memory.
 
Hmm, guess I was mistaken on that, but yeah, time to look and see how the 390X performs.

If it's just a few % off and a lot cheaper, then I don't see myself going for the Titan X.
 
You're still OK with the 920, but I upgraded from a 980X to a 4790K and get much smoother performance in everything, so it's still worth it IMO. Then again, with DX12 being more efficient, newer games might be just fine on older CPUs. Less brute force needed?
 