
What's the CPU to get today for overclocking?

Anubis

No Lifer
Getting that itch again. Current rig is an i7 920 running at 4 GHz. I don't really need to upgrade, but I'm curious what the current value-per-dollar setup for overclocking is now; I haven't been paying attention at all for a few years.
 
I'd personally say the 2600K: it has the best high-end clock speeds. It's slightly slower clock-for-clock than the new Ivy Bridge, but Ivy Bridge runs hot, so its overclocking headroom tends to be somewhat lower.

If you're willing to install a good cooler, I think the 2600K is the better choice overall. Mine's at 4.7 GHz with a 1.285 V vCore and a Thermalright True Spirit 140 cooler, running at 60-65 °C under full load. I know for a fact I can push this to 4.8 GHz easily, and probably up to 5 GHz if I'm willing to put way more vCore through it; not really worth it, though.

I've read the IB chips tend to top out around 4.5 GHz.
 
My 2600k can run right around those speeds, but of course the 2700k is faster. I usually keep it between 4.0 and 4.4 (I have gotten it up to 4.9), just to keep the heat and fan speed down.
 

Yeah, I'd be doing the large air cooler setup, as I'm not into water cooling.
 
The Intel® Core™ i7-2600K and Intel Core i7-2700K are both great processors, but based on what you're going to be doing with the system, you may get as much value out of the Intel Core i5-2500K or the Intel Core i5-3570K. If you're mainly going to be doing something like gaming, the Intel Core i5-2500K is outstanding for performance and value. With a good cooler you can get the Intel Core i5-3570K to 4.2 GHz or 4.3 GHz easily, and it will perform at about the level of the Intel Core i5-2500K at 4.6 GHz. Most tests show around a 6% performance boost for the 3rd generation Intel Core processors at the same clock speed.
 
If saving money is the main issue, go for the Sandy Bridge parts, which are cheaper than their IB counterparts, although keep your eye out for deals: TigerDirect had an i5-3570K deal just a few days ago for $179, which would make the choice a no-brainer IMO. If, on the other hand, you want the newer-gen CPUs, then IB is the way to go. They do have a few benefits, such as lower power consumption, faster per-clock performance, and support for PCIe 3.0.
 
I can afford basically whatever, but I generally look for what gives the best bang for the buck; there aren't a lot of good reasons to drop a grand on one of the Extreme CPUs IMO.

So either a 2600K or one of the 35/37xx series. The i5-to-i7 difference is no Hyper-Threading, right? Anything else of note? Do most overclockers still leave HT on, or turn it off? It's still on on my 920 with no issues.
 
Actually, certain programs I use do hit all 8 "cores": Vegas and Photoshop, to name two. The only reason I brought it up is that some people turned HT off when overclocking their 920s for stability; I never had that issue. Not that it is an issue.

I'm assuming that unless you do a ton of rendering, or something else that will hit more than 4 cores, it's a wash with the newer stuff.
 
Not aimed solely at you Anubis, but why would disabling HT make a chip overclock better? Would there be less heat involved with HT disabled?
 

Turning ON Intel's HTT allows the chip to run two threads per core, which essentially allows each core to use more of its execution units per clock cycle. More execution units running at the same time means more work is done in a given time frame, and that translates into higher heat generation.

Assuming the goal of an OC is to hit as high of a frequency as possible then yes, cutting down on heat generation helps achieve a higher overclock by minimizing the chance that temperature ends up becoming a constraint in achieving higher clocks.
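The "every hardware thread busy" situation above is easy to reproduce on any OS. A rough Python sketch (the `burn` helper is mine, purely illustrative): with HT on, the OS exposes two logical CPUs per physical core, so sizing a process pool to `os.cpu_count()` loads every hardware thread at once, which is exactly the full-load condition where HT draws more power and makes more heat.

```python
import os
from multiprocessing import Pool

def burn(n: int) -> int:
    """CPU-bound busywork with no I/O: sum of squares below n."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    logical = os.cpu_count() or 1            # counts HT siblings too
    with Pool(processes=logical) as pool:    # one worker per logical CPU
        done = pool.map(burn, [200_000] * logical)
    print(f"{len(done)} workers kept {logical} logical CPUs loaded")
```

With HT off, `os.cpu_count()` halves, so the same code saturates the chip with half as many workers; that's the whole heat difference in one line.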
 

Nice answer, makes sense.
 

It depends if you're after purely high numbers to brag or after maximum performance.

Actual performance with HT on or off can differ depending on the application. It's tempting to say performance is always better by some degree, but that's not always the case: if an application is limited in the number of threads it spawns, and one of those threads happens to run on an HT logical core rather than an otherwise-idle physical one, your performance can be lower.

Game benchmarks with HT on and off show varying results. Newer games tend to use HT well and see a minor performance increase, while some older games see no benefit or even a loss, so it's highly application-dependent.

HT off will certainly allow for faster stable speeds, though.
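If you'd rather not reboot into the BIOS just to compare, the "HT off for one application" idea can be approximated with CPU affinity. A hedged, Linux-only sketch (the helper name is mine, and it ASSUMES HT sibling pairs are numbered `(i, i + ncores)`, which is common but not guaranteed; check your own topology first):

```python
import os

def pin_to_physical_cores() -> set:
    """Restrict this process to (roughly) one logical CPU per core.

    Crude approximation: keeps the lower half of the allowed logical
    CPUs, assuming siblings are numbered (i, i + ncores). Linux-only
    (os.sched_setaffinity is not available on all platforms).
    """
    logical = sorted(os.sched_getaffinity(0))   # CPUs we may run on now
    half = set(logical[: len(logical) // 2])
    physical = half or set(logical)             # keep at least one CPU
    os.sched_setaffinity(0, physical)
    return physical
```

Run a benchmark with and without the pin and you get an HT-on vs HT-off comparison for that one workload, without touching firmware.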
 

Doesn't this only work, though, if the software is coded to execute in multiple threads? More software uses the extra cores now, but plenty is still single-threaded, which means no heat increase.😕

I may be understanding it all wrong too.
 
I agree with what you guys are saying. If I recall correctly, HTT used to be hit or miss. Looking back at some benchmarks for NetBurst-era chips (AnandTech's article on this is really good), you can tell that Hyper-Threading was ideal in certain workloads but detrimental in others. From what I've noticed, leaving HTT on has provided a more consistent boost across the board as Intel's microarchitectures have gotten wider and wider. Haswell's microarchitecture is even wider than SNB's; in my opinion, this is a clear indication that Intel consciously designs each new microarchitecture with Hyper-Threading in mind. The two heavy-duty FMA units, along with the addition of two new ports to help offload ports 0 and 1, are a testament to that.

As for multithreaded software and HTT, the scheduler should load up all physical cores first before it starts to allocate threads to the virtual cores, at least that is my understanding. So in theory, if the software calls for 4 threads, each core will be handling one thread. Go over four, and the additional threads will be handled by hyperthreading each physical core. For those of you who like analogies, think of Hund's Law. It is an elegant solution, in my opinion. But to answer your question explicitly: yes, from my understanding, if the code doesn't force the scheduler to call on Hyper-Threading, then you shouldn't see additional heat. However, Windows has dozens of threads running constantly, so I would assume that HTT on generates slightly more heat regardless.
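If you want to see the physical-vs-virtual split the scheduler is working with, Linux exposes it in sysfs. A rough sketch (the path layout is an assumption based on common Linux kernels, and it ignores multi-socket boards, where `core_id` values repeat per package):

```python
import os
from collections import defaultdict

def core_map(sysfs_root: str = "/sys/devices/system/cpu") -> dict:
    """Map core_id -> [logical CPU ids]; HT siblings share a core_id.

    Single-socket simplification: on multi-socket boards core_id
    repeats across packages and this would merge them.
    """
    cores = defaultdict(list)
    cpu = 0
    while True:
        path = os.path.join(sysfs_root, f"cpu{cpu}", "topology", "core_id")
        if not os.path.exists(path):
            break
        with open(path) as f:
            cores[int(f.read())].append(cpu)
        cpu += 1
    return dict(cores)

if __name__ == "__main__":
    for core, cpus in sorted(core_map().items()):
        tag = "HT siblings" if len(cpus) > 1 else "no HT"
        print(f"core {core}: logical CPUs {cpus} ({tag})")
```

With HT on, each core shows two logical CPUs; those pairs are the "virtual cores" the scheduler fills last.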
 
OK, now I feel like we are discussing this with Sheldon Cooper.


LOL

Lol, I meant to say Hund's Law. The Pauli exclusion principle states that no two electrons share the same set of quantum numbers; Hund's Law describes how electrons fill the orbitals within a subshell.
 

The analogy is very apropos :thumbsup:


Cooper always came across as rather resistive when hot, but if you could get him to cool down, he'd tend to find someone else to pair up with, and suddenly you couldn't hold him back no matter what, unless a really magnetic person joined the crowd 😉
 