Just got an i7-970 with a DX58SO board.

Klingenberg

Member
Oct 29, 2012
31
0
61
Bumped the bclk to 150 MHz and it currently runs at 3.75 GHz.
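If my math is right, that works out to bclk × multiplier: 150 MHz × 25 = 3,750 MHz, up from the stock 133 MHz × 24 ≈ 3.2 GHz (I'm assuming the multiplier is sitting at 25x, since that's what the numbers imply).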

I don't have a lot of experience overclocking, so advice would be greatly appreciated.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
Those usually clock to well over 4 GHz.

Try a lower CPU multiplier at first to find a bclk that gives you a high core clock while still letting you run your memory at a reasonable speed. I believe there was also something that always has to run faster than the bclk or the memory, but I can't remember the details; I was on s1156 myself, so I have no direct experience with X58.
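For what it's worth, here's a rough sketch of how the clocks derive from the bclk on these boards. The ratio values are just examples (your BIOS may expose different ones), and the uncore rule is from memory rather than anything I've verified, so treat it as a starting point, not gospel:

# Rough X58 clock math. The uncore-vs-memory check reflects the
# "uncore ratio >= 2x memory ratio" rule as I recall it for Bloomfield;
# Gulftown parts like the i7-970 supposedly allow 1.5x. Double-check
# against what your BIOS actually accepts.
def x58_clocks(bclk, cpu_mult, mem_mult, uncore_mult):
    core = bclk * cpu_mult        # CPU core clock, MHz
    memory = bclk * mem_mult      # effective DDR3 data rate, MT/s
    uncore = bclk * uncore_mult   # uncore (L3 + memory controller), MHz
    ok = uncore_mult >= 2 * mem_mult
    return core, memory, uncore, ok

print(x58_clocks(133, 24, 8, 16))   # roughly stock: ~3.2 GHz core, DDR3-1066
print(x58_clocks(150, 25, 8, 16))   # the OP's current run: 3.75 GHz core, DDR3-1200

The point is just that raising the bclk drags the memory and uncore up with it, which is why you drop the other ratios while hunting for the maximum stable bclk.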
 

Burpo

Diamond Member
Sep 10, 2013
4,223
473
126
The i7-970 is a 130 W Gulftown, and above ~170 bclk you may have heat problems. Still a capable CPU.
 

el etro

Golden Member
Jul 21, 2013
1,581
14
81
Put a GTX 1080 on it. The i7-970 pushes the card even at stock clocks.
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
Put a GTX 1080 on it. The i7-970 pushes the card even at stock clocks.

Maybe if you run at 1440p. 1080p is asking for serious bottlenecks with a GTX 1080; the clocks and IPC are lacking on that generation of i7, even if the thread count isn't.
 

spat55

Senior member
Jul 2, 2013
539
5
76
Put a GTX 1080 on it. The i7-970 pushes the card even at stock clocks.
That is utter rubbish; my 3570K @ 4.5 GHz bottlenecks my GTX 1070 in BF1, and many other titles stutter annoyingly.
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
The i7-970 does have 6 cores and 12 threads though...

Sub-4 GHz and low-IPC 6C/12T though...

Don't get me wrong, the i7-970 was a monster in its day, but time marches on and I don't think it's a good idea to pair it with a modern high-end GPU.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Sub-4 GHz and low-IPC 6C/12T though...

Don't get me wrong, the i7-970 was a monster in its day, but time marches on and I don't think it's a good idea to pair it with a modern high-end GPU.
It probably still holds its own against a 3570K sometimes when things get multithreaded, because of the cores, the cache, the overclocking, and the triple-channel DDR3?
https://www.anandtech.com/bench/product/157?vs=701
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
It probably still holds its own against a 3570K sometimes when things get multithreaded, because of the cores, the cache, the overclocking, and the triple-channel DDR3?
https://www.anandtech.com/bench/product/157?vs=701

I personally wouldn't pair a 3570K (or any 4C/4T CPU) with a high-end GPU either, to be fair. I wouldn't be surprised if the i7-970 actually beat the 3570K in highly threaded games like Forza 7 or AC:O, but the relative lack of clockspeed and IPC means it's probably better suited to a GTX 1060 / RX 580 class card to avoid serious bottlenecking, especially at 1080p.
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
My X5650 runs great in Battlefield, as Frostbite is very well multithreaded

And it is paired with a Radeon 290, a sensible match for this class of CPU. Would you feel comfortable pairing it with a GTX 1080, though, as someone in this thread suggested? That would be quite an imbalanced setup to me, unless you are gaming at 1440p or 4K.
 

CropDuster

Senior member
Jan 2, 2014
366
45
91
I would say a GTX 1080 probably isn't sensible for anyone unless they are gaming at 1440p+ or at high refresh rates. This always goes back to a person's definition of good enough and what games they play, though. IMO, X58 is still good enough, especially in well-threaded games with the 6C/12T variants. I was surprised how little improvement my 5820K made in most of the games I play. Some, like ArmA and IL-2, would be better off with a 5 GHz i3 I think =/
 

slashy16

Member
Mar 24, 2017
151
59
71
I just retired the same board with a 990X, and I have to say it will run everything you throw at it. I only upgraded because the 8700K came out. You can sell that board and chip for mad cash on eBay, so keep that in mind if you want to upgrade.
 

MajinCry

Platinum Member
Jul 28, 2015
2,495
571
136
The Nehalem architecture of Intel CPUs is actually really good at draw calls, a tad better than Haswell IIRC. If you bump up the clocks, it should hold its own against a stock xLake, supposing there's a ton of objects on screen.
 

tarmc

Senior member
Mar 12, 2013
322
5
81
I've benched my X5650 @ 4.0 GHz and a 4790K at stock with the same video card in Fire Strike; they get about the same score. I don't see why you can't run 200 bclk with a lower multiplier. Maybe try 4.0 GHz and see how it runs?
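To put rough numbers on that (assuming the board and RAM will actually hold 200 bclk): 200 MHz × 20 = 4.0 GHz on the core, and with the memory ratio dropped to 6x the DDR3 lands at an easy 1200 MT/s, so the sticks shouldn't be the limiting factor.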
 
  • Like
Reactions: el etro

legcramp

Golden Member
May 31, 2005
1,671
113
116
And it is paired with a Radeon 290, a sensible match for this class of CPU. Would you feel comfortable pairing it with a GTX 1080, though, as someone in this thread suggested? That would be quite an imbalanced setup to me, unless you are gaming at 1440p or 4K.

I paired an X5650 @ 4 GHz with a 1080 Ti, and that setup gave a Ryzen 1600 @ 4 GHz with a 1080 Ti a run for its money. Got rid of both systems and upgraded to an 8700K with the same 1080 Ti.

The X5650 worked great with the 1080 Ti, and I only wanted more power in very CPU-limited games like PUBG. The Xeon setup was running an R9 290, and upgrading to the 1080 Ti for 1440p was AMAZING, CPU-limited or not.

With that said, with the 8700K the FPS in general saw another huge boost if you play CPU-bound games or at competitive settings.
 
  • Like
Reactions: epsilon84

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
I paired an X5650 @ 4 GHz with a 1080 Ti, and that setup gave a Ryzen 1600 @ 4 GHz with a 1080 Ti a run for its money. Got rid of both systems and upgraded to an 8700K with the same 1080 Ti.

The X5650 worked great with the 1080 Ti, and I only wanted more power in very CPU-limited games like PUBG. The Xeon setup was running an R9 290, and upgrading to the 1080 Ti for 1440p was AMAZING, CPU-limited or not.

With that said, with the 8700K the FPS in general saw another huge boost if you play CPU-bound games or at competitive settings.

Interesting that you say the X5650 came close to the R5 1600; I would have thought there would be a clear gap between it and the Ryzen, since Ryzen is supposed to have IPC similar to Haswell, which is several generations newer than Westmere.

A 1080 Ti is a huge improvement over an R9 290, especially at 1440p. Still, I imagine your minimum fps would be much better on an 8700K, especially in CPU-bound games such as PUBG.

Do you happen to have any numbers to show the kind of improvement we can expect going from Westmere to Coffee Lake?
 

tamz_msc

Diamond Member
Jan 5, 2017
3,821
3,643
136
Interesting that you say the X5650 came close to the R5 1600; I would have thought there would be a clear gap between it and the Ryzen, since Ryzen is supposed to have IPC similar to Haswell, which is several generations newer than Westmere.

A 1080 Ti is a huge improvement over an R9 290, especially at 1440p. Still, I imagine your minimum fps would be much better on an 8700K, especially in CPU-bound games such as PUBG.

Do you happen to have any numbers to show the kind of improvement we can expect going from Westmere to Coffee Lake?
Perf/clock similarity in applications doesn't necessarily translate into games. There are still many CPU-intensive open and semi-open world games favoring Intel, like Watch Dogs 2, where Westmere and Ryzen will have similar performance.
 

el etro

Golden Member
Jul 21, 2013
1,581
14
81
PUBG is a programming mess. Most games rely heavily on the GPU; that's why I placed my bet on a GTX 1080. Forget the IPC BS, the difference from Nehalem to Skylake is nothing earth-shattering. Spend your money on a rig pairing the weakest CPU with the most powerful GPU you can, getting the best FPS-per-dollar ratio in most games.
 

el etro

Golden Member
Jul 21, 2013
1,581
14
81
My FPS in 2017 AAA titles with an i5-2310 plus RX 580: 60 average in The Evil Within 2, 55 average in Hellblade: Senua's Sacrifice.
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
PUBG is a programming mess. Most games rely heavily on the GPU; that's why I placed my bet on a GTX 1080. Forget the IPC BS, the difference from Nehalem to Skylake is nothing earth-shattering. Spend your money on a rig pairing the weakest CPU with the most powerful GPU you can, getting the best FPS-per-dollar ratio in most games.


The difference from Nehalem to Skylake is like 50% per clock, plus Skylake (and its derivatives) clocks a lot higher as well:
https://www.gamersnexus.net/guides/2980-intel-i7-930-revisit-nehalem-benchmarks-2017/page-3
[Chart: i7-930 Watch Dogs 2 benchmark from the GamersNexus revisit]
 
  • Like
Reactions: ZGR

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
My FPS in 2017 AAA titles with an i5-2310 plus RX 580: 60 average in The Evil Within 2, 55 average in Hellblade: Senua's Sacrifice.

What is your point? Without a point of reference or comparison, those numbers are meaningless. Have a look at the chart above to see how your i5 compares to modern CPUs. We are talking about a DOUBLING of framerates from an SB i5 to a KBL i7. So scream all you want, bold whatever you want; the numbers speak for themselves, and I'm not going to advocate pairing a slow CPU with a high-end GPU.
 

tarmc

Senior member
Mar 12, 2013
322
5
81
The chart above shows a benchmark from a 4C/8T CPU as opposed to a 6C/12T one, which could impact performance. Depending on the title, it may not make a difference at all whether you're using a 970, 2500, 4690, etc.; price point is the biggest difference.