
Is it really worth OC'ing the i5 2500k?

No one has ever reported any damage from doing it.

That's the problem: some did report damage. The Intel representative who used to post here also mentioned it. The limit stated is actually 1.575V (which is just the JEDEC 1.5V nominal plus the normally allowed max 5% deviation).
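For anyone wondering where 1.575V comes from, it falls straight out of that tolerance math (a quick sanity check, not from any datasheet):

```python
# JEDEC DDR3 nominal supply voltage, with the standard +/-5% tolerance.
nominal = 1.5      # volts
tolerance = 0.05   # 5%

vmax = nominal * (1 + tolerance)  # upper bound of the allowed range
vmin = nominal * (1 - tolerance)  # lower bound of the allowed range

print(f"Allowed VDDQ range: {vmin:.3f}V to {vmax:.3f}V")  # 1.425V to 1.575V
```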

But let's stick to overclocking the CPU first; then we can get into RAM overclocking and its questionable benefits.
 
You are doing yourself a disservice if you do not overclock the 2500K.
You can get 15-25 more FPS just by OCing it and the RAM.
PS: Ignore that stuff about "Oh, don't go over 1.5V on the RAM."
It causes no damage to volt the crap out of your RAM on anything from Sandy Bridge to Haswell.
You can either believe all the benchers, or someone that read some false article sometime in the past. I ran DDR3-2133 on SB at 1.73V for months on end... no damage.
No one has ever reported any damage from doing it. It's been years now.

Furthermore SB runs cool, so you can get a nice, high OC on air/water.

I wouldn't go over 1.53v on air/water.

Yeah . . . not your fault, but given our exchange about the VCORE, this could be confusing. Also -- did you mean "I wouldn't go over 1.53V" for the CPU, or for the RAM?

On the RAM, things may have changed since the DDR2 days. I remember then, corresponding with Corsair and Crucial tech-reps about voltage and warranty. Crucial was over-optimistic about what you could do with Ballistix; a lot of people were disappointed, and Crucial had a helluva RMA replacement-under-warranty expense.

I was going to post a more thorough account about my G.SKILL Ripjaws "Gerbils" -- the -GBRL model spec'd for DDR3-1600.

On my RAMs, the vDIMM spec is 1.5V. [My point about the "old days" was that this operational spec was pretty damn close to the warranty limit for those DDR2's.] G.SKILL tech support had boldly told customers that the warranty limit on all their RAM was 1.65V, and that you could run the "Gerbils" at 1.60V to boost the speed to 1866 and timings 9-10-9-28.

But the 1.575V cautionary limit is Intel's -- not the RAM maker's. I get DDR3-1866 10-10-10-28 rock-solid at 1.55V. So I'm debating whether to tighten the latencies and STILL attempt running them closer to 1.575 than 1.60.

Anyway, I'm being a chatterbox this morning. I should take my "medication." :whistle:
 
Lately, I've been thinking of buying a cooler and slapping a 4.5 GHz or so OC on my i5. I definitely expect to see performance gains in benchmark runs, in a select few CPU-limited games, and in games in general.

Then I stumbled across this test: http://www.ocaholic.ch/modules/smartsection/item.php?itemid=1129&page=4

At 1080p there are no gains, with the exception of Skyrim. I play at 1440p and expect the gains to be even smaller.

So let me get some opinions from members here. Is the above test skewed, or would the OC definitely help minimums in games?

100% completely worth overclocking. You should be able to hit 4.0 GHz on stock voltage, and 4.4-4.7 GHz with increased voltage. I could hit 4.5 GHz without going over 70°C with a Hyper 212 in push-pull on my 2500K.

Skyrim is single-thread performance bound because the engine is only dual-threaded, and somewhat poorly optimized for PC at that. You will see an improvement that the "average FPS" metric does not readily capture: better minimum frame rates and better frame times. Even if your average FPS stays the same or only increases slightly, it will feel smoother.
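To see why average FPS can hide stutter, compare two made-up runs (hypothetical frame times, just to show the math):

```python
# Two hypothetical runs with nearly the same average FPS but very
# different smoothness. Frame times are in milliseconds.
smooth = [16.7] * 60                   # steady ~60 FPS, every frame alike
stuttery = [10.0] * 54 + [83.3] * 6    # mostly fast frames, a few long hitches

def avg_fps(frame_times_ms):
    # total frames divided by total elapsed time
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def min_fps(frame_times_ms):
    # the worst single frame, which is what you feel as a hitch
    return 1000.0 / max(frame_times_ms)

for name, run in [("smooth", smooth), ("stuttery", stuttery)]:
    print(f"{name}: avg {avg_fps(run):.0f} FPS, min {min_fps(run):.0f} FPS")
```

The averages come out within a couple of FPS of each other, but the stuttery run's worst frame is around 12 FPS, which is exactly the difference a CPU overclock tends to smooth out.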
 
I thought I was done posting about the 2500K back in 2011.
http://forums.anandtech.com/showthread.php?t=2200753
People are still confusing Intel's published max VID range of 1.52V for this CPU with the max vcore, or their suggested max VDDQ (processor I/O supply voltage for DDR3) of 1.575V, page 82 of their datasheet. As far as I know, Intel has never formally published the safe max vcore for the 2500K. http://www.intel.com/content/www/us/en/processors/core/2nd-gen-core-desktop-vol-1-datasheet.html
 
I thought I was done posting about the 2500K back in 2011.
http://forums.anandtech.com/showthread.php?t=2200753
People are still confusing Intel's published max VID range of 1.52V for this CPU with the max vcore, or their suggested max VDDQ (processor I/O supply voltage for DDR3) of 1.575V, page 82 of their datasheet. As far as I know, Intel has never formally published the safe max vcore for the 2500K. http://www.intel.com/content/www/us/en/processors/core/2nd-gen-core-desktop-vol-1-datasheet.html

Lol at the first part...it has been a while, hasn't it?

As for "safe" max vcore, I think the discussion here is just about people's experience, not any published specs. My posts are certainly from experience, and not relying on anything published. At the end of the day, the point is that the OP should OC.
 
Lol at the first part...it has been a while, hasn't it?

As for "safe" max vcore, I think the discussion here is just about people's experience, not any published specs. My posts are certainly from experience, and not relying on anything published. At the end of the day, the point is that the OP should OC.

Yes, I also think he should do it.

But . . ."people's experience . . ." That was an emerging problem -- something Intel chose not to nail down for us. It may have partially contributed to the misconception about the VID range.

I'm not a physicist, although I got through sophomore "relativity." I KNOW a physicist, though. It was his opinion that there's a limit on transistors in general -- he said it was 1.5V. But donchya see? That's my hearsay about what he recollects from his sickbed.

We came up with an informal consensus for 32nm lithography. We were probably OK with it. But a lot of people got it into their heads that you could do with a Sandy Bridge what you could do with an old Northwood -- voltage-wise.
 
I'm not really suffering in performance. The SSD and RAM are already noted for upgrade. I had just been googling and really started wondering if my i5 was worth OC'ing. And considering it is the K, I thought I might as well make use of the extra performance if the gains are substantial.

An overclocked 2500K was the best CPU bang-for-buck of the last three years. Whether it's worth it depends on which games you play and whether they are CPU-limited. BF3/BF4/Arma2/Arma3 and a few others often are, in which case it's worth it if you want significantly faster FPS. E.g., in Arma2 I could always tell when my i7 920 reverted to stock from its 4 GHz OC: FPS would drop from 40 to 28 or similar, an almost linear decline mirroring clock speed. That was with 2x 6990's, though, so unless you upgrade your GPU you may well not notice a difference. If playing COD or console ports etc., I doubt it's worth it.
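That "almost linear" claim checks out roughly against the clocks (assuming a 2.66 GHz stock i7 920 and a fully CPU-bound game -- both assumptions, not measurements):

```python
# Rough check: does a 40 -> 28 FPS drop track a 4.0 -> 2.66 GHz clock drop?
oc_clock, stock_clock = 4.0, 2.66   # GHz (2.66 is the i7 920's stock base clock)
oc_fps = 40.0                       # observed FPS at the overclock

# If the game is fully CPU-bound, FPS scales roughly linearly with clock.
predicted_stock_fps = oc_fps * (stock_clock / oc_clock)
print(f"Predicted stock FPS: {predicted_stock_fps:.1f}")  # 26.6, close to the observed 28
```

Predicting ~27 FPS against an observed 28 is about as linear as real-world scaling gets, which supports the CPU-bound diagnosis.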
 
No, I don't think it's worth overclocking a 2500K or any CPU for that matter. It can cause system instabilities and data corruption, and it risks damaging and reducing the lifespan of your CPU. Overclocking also adds another element to troubleshooting computer problems. It also takes too much testing and time to verify overclocks that might or might not be stable. In my opinion, if you want more CPU speed, even if it's not much faster, I suggest selling your 2500K with the motherboard and upgrading to either a 4670K or 4770K with a new motherboard, as Socket 1155 does not support Haswell chips. However, if you are going to lose too much money in the process, then stick with what you have and maybe upgrade to a faster GPU if you don't already have one.

This post should be pinned. It is simple in its eloquence of including pretty much every OC myth and cliche all in one paragraph.
 
No, I don't think it's worth overclocking a 2500K or any CPU for that matter. It can cause system instabilities and data corruption, and it risks damaging and reducing the lifespan of your CPU. Overclocking also adds another element to troubleshooting computer problems. It also takes too much testing and time to verify overclocks that might or might not be stable. In my opinion, if you want more CPU speed, even if it's not much faster, I suggest selling your 2500K with the motherboard and upgrading to either a 4670K or 4770K with a new motherboard, as Socket 1155 does not support Haswell chips. However, if you are going to lose too much money in the process, then stick with what you have and maybe upgrade to a faster GPU if you don't already have one.


This post should be pinned. It is simple in its eloquence of including pretty much every OC myth and cliche all in one paragraph.

Indeed.
 
No, I don't think it's worth overclocking a 2500K or any CPU for that matter. It can cause system instabilities and data corruption, and it risks damaging and reducing the lifespan of your CPU. Overclocking also adds another element to troubleshooting computer problems.


Well sure, higher voltage and higher CPU temps do reduce the lifespan of any PC component. However, we're talking about going from a 30+ year lifespan down to 15-20 years if you overclock...

Now when you mentioned that it can lead to data corruption, I knew you had to be trolling.
 
Well sure, higher voltage and higher CPU temps do reduce the lifespan of any PC component. However, we're talking about going from a 30+ year lifespan down to 15-20 years if you overclock...

Now when you mentioned that it can lead to data corruption, I knew you had to be trolling.

30 years? Isn't that a stretch? It seemed that, for a very long time, the life expectancy of any Intel processor operating under the simple thermal guidelines for the particular model was ten years.

AigoMorla -- whose obsession with cooling here is legendary and an inspiration for joking -- had posted his "best guess" about how longevity would be affected by different voltages, some of them exceeding spec. Sensible choices might lead to a 5-to-7-year lifespan.

But the problem there is that few people wait that long to find out. We had also been treated to some white papers published here about transient voltages in the context of LLC settings. In earlier days there was also a practice of fixed-voltage overclocking, but that's all changed now that you can overclock with energy-saving features like EIST enabled.

I think it was more common in earlier processor and chipset generations that motherboard parts like capacitors would degrade, forcing the overclocker to revise his voltage settings to either hold the same speed or reduce the speed to accommodate the new voltage status quo. But even that has changed with the newer motherboards.
 
30 years? Isn't that a stretch? It seemed that, for a very long time, the life expectancy of any Intel processor operating under the simple thermal guidelines for the particular model was ten years.

10 years sounds about right for 24/7 use @ 100% CPU. However, typical users, especially those who are afraid to OC, can easily stretch that time frame at least twofold.
 
10 years sounds about right for 24/7 use @ 100% CPU. However, typical users, especially those who are afraid to OC, can easily stretch that time frame at least twofold.

Well . . . . I've known people who were still using Windows 95 in 2007. It doesn't work both ways, but usually the obsolete OS persists on the obsolete computer. 😉
 
Well . . . . I've known people who were still using Windows 95 in 2007. It doesn't work both ways, but usually the obsolete OS persists on the obsolete computer. 😉

Honestly, if it supported current hardware and apps, I'd still use Win2000. Based on the NT kernel, incredibly stable and robust... In fact, it was so good I never installed XP on a machine I used at home. I rode 2000 into Vista (sadly); then 7 came out shortly thereafter, thankfully.
 
Honestly, if it supported current hardware and apps, I'd still use Win2000. Based on the NT kernel, incredibly stable and robust... In fact, it was so good I never installed XP on a machine I used at home. I rode 2000 into Vista (sadly); then 7 came out shortly thereafter, thankfully.

You're not the only one! I think I installed it in 1999, and only bought XP licenses (for mine and other family machines) in 2004.

Life seemed simpler before the Millennium . . .
 