Why does ANYONE buy the 2550K?


VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
You don't really understand how this works. Hyper-Threading is not something that can be broken during manufacturing.
Sure it can. If there's a defect in the silicon die area used specifically for HT, Intel can disable it and sell it as a working CPU, as I understand it. Just like a defect in the cache can cause them to bin the chip as a Celeron.

The difference between the 2600(K) and the 2500(K) is 2MB of L3 cache, which may or may not be defective in the die and is disabled on the 2500(K). There's also Hyper-Threading, which gives you roughly 20% more performance than not having it in heavily multi-threaded programs, but is not a part that can be damaged during manufacturing. It's disabled for product differentiation by microcode. It's basically a way for Intel to get $100 extra for something that was originally working and didn't need to be disabled in the first place.
I mean, sure, in most cases it's just intentionally disabled, but I wouldn't go stating as a fact that HT cannot be defective. Any part of the die could potentially be defective.

Like I said before, the 2550K is more "broken" than the 2500K or 2600K. The IGP is disabled by microcode, which means you're not getting any better power consumption or lower temperatures.
Again, you state this as fact, when I personally doubt it to be true. I would think that they would laser-cut the power lines to the IGP, so it should in theory take less power.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
Sure it can. If there's a defect in the silicon die area used specifically for HT, Intel can disable it and sell it as a working CPU, as I understand it. Just like a defect in the cache can cause them to bin the chip as a Celeron.


I mean, sure, in most cases it's just intentionally disabled, but I wouldn't go stating as a fact that HT cannot be defective. Any part of the die could potentially be defective.


Again, you state this as fact, when I personally doubt it to be true. I would think that they would laser-cut the power lines to the IGP, so it should in theory take less power.

Yeah, except the odds of it being defective in manufacturing are probably around 0.01% or less. There is a part on the die that enables the use of Hyper-Threading, but it's incredibly small (if I remember right it's less than 10mm^2), and therefore the chances of it being damaged are statistically insignificant. It's disabled for product differentiation; nothing else. The L3 cache is another story, though. The L3 cache in Sandy Bridge takes up a significant number of transistors and a significant amount of die area, so it's possible Intel disables it on some samples where part of it isn't functional.

Also, laser-cutting the IGP isn't worth it. It drives costs up and you can still disable it via microcode. I'm 99% sure it's disabled by microcode, which is also how Intel disables Hyper-Threading in the desktop Core i5.
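
Side note for anyone who wants to see the effect of that microcode fusing from software: a minimal sketch, assuming Python with the third-party psutil package installed (an assumption on my part; any equivalent tool works), that compares physical and logical core counts. A 2600K should report 4 cores / 8 threads; a 2500K or 2550K, 4 / 4.

    import psutil  # third-party: pip install psutil

    physical = psutil.cpu_count(logical=False)  # real cores
    logical = psutil.cpu_count(logical=True)    # hardware threads the OS schedules on

    print(f"{physical} cores / {logical} threads")
    print("HT active" if logical > physical else "HT off or not present")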
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
If all you do is gaming, then you go for the 2500K or 2550K, but to say they're in any way better than the 2600K or 2700K is completely disingenuous.

Exactly. Even if you didn't want HT, you could always disable it on the i7s, and you'd still have 33% more L3 cache than the i5s.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
Hyper-Threading does have a benefit, and a very big one in many programs. What you "felt" is a good example of the placebo effect at work. There's no such thing as "micro-stutter" from having HT on. All HT does is make more efficient use of the processor's pipeline, giving higher performance.

Snipped the images to save scrolling time for others

If all you do is gaming, then you go for the 2500K or 2550K, but to say they're in any way better than the 2600K or 2700K is completely disingenuous.

Maybe micro-stutter wasn't a good choice of words.

When I was testing the 2700k I came to the conclusion that hyper-threading isn't for me. Using the same clock speeds, for example, the system felt snappier with HT disabled. With HT the system seemed to be not as smooth, fluid, or responsive doing the things I do. At stock or very low baby clocks it wasn't noticeable. To me it just felt like the hyper-threading couldn't keep up with the real cores the faster the chip was clocked. It wasn't a placebo effect. I doubt I'm the only one who has felt this.
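
If anyone wants to put numbers on this instead of going by feel, here's a rough sketch (plain Python; the workload sizes are made up) you could run twice at the same clocks, once with HT enabled and once with it disabled in the BIOS, and compare wall times:

    import time
    from concurrent.futures import ProcessPoolExecutor

    def burn(n):
        # CPU-bound busy work; processes (not threads) sidestep Python's GIL
        s = 0
        for i in range(n):
            s += i * i
        return s

    if __name__ == "__main__":
        jobs = [2_000_000] * 16           # fixed total work, split across workers
        for workers in (1, 2, 4, 8):      # 8 only pays off with HT (or 8 real cores)
            start = time.perf_counter()
            with ProcessPoolExecutor(max_workers=workers) as pool:
                list(pool.map(burn, jobs))
            print(f"{workers} workers: {time.perf_counter() - start:.2f}s")

If the 8-worker run beats the 4-worker run, HT is adding throughput for that workload; if the two match, it isn't doing anything there.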

I didn't state that hyper-threading wasn't worth it, that it sucks, or that it doesn't have any use in the real world, as far as I remember. All I stated was that it's not for me. I didn't say that the 2550k/2500k's are better than the 2600k/2700k's :)

LOL_Wut_Axel said:
The difference between the 2600(K) and the 2500(K) is 2MB of L3 cache, which may or may not be defective in the die and is disabled on the 2500(K). There's also Hyper-Threading, which gives you roughly 20% more performance than not having it in heavily multi-threaded programs, but is not a part that can be damaged during manufacturing. It's disabled for product differentiation by microcode. It's basically a way for Intel to get $100 extra for something that was originally working and didn't need to be disabled in the first place.

This statement loses me on the whole "2550k is a defective 2500k" argument. The IGP may or may not have been defective; I guess only Intel would know.

For what it's worth, I couldn't care less anyway! I'm happy with my defective 2550k :D
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
Encoding media on the IGP while gaming/sleeping. It uses the CPU fan to cool it, so it won't generate any extra noise.

So clearly you don't encode with GPUs. I encode with my discrete GPU all the time and it never uses more than 20-40% of the GPU, which isn't even close to enough load to get the fan to spool up, even on a power-sucking demon like my GTX 480. The reason someone would buy a 2550k is that it's obviously cheaper than a 2500k, so I'm sure you can understand why it looks like you paid $70+ more than anyone who got a deal on a 2550k did, for a sh*t GPU and nothing else.

You feel pretty strongly about it all though and your intent with this thread is questionable.
 

Smoblikat

Diamond Member
Nov 19, 2011
5,184
107
106
So clearly you don't encode with GPUs. I encode with my discrete GPU all the time and it never uses more than 20-40% of the GPU, which isn't even close to enough load to get the fan to spool up, even on a power-sucking demon like my GTX 480. The reason someone would buy a 2550k is that it's obviously cheaper than a 2500k, so I'm sure you can understand why it looks like you paid $70+ more than anyone who got a deal on a 2550k did, for a sh*t GPU and nothing else.

You feel pretty strongly about it all though and your intent with this thread is questionable.

I feel as strongly about my investment as you do about yours. How is that wrong? I encode with my IGPU and game on my regular card, and it works out great. If there were a deal on the 2550K I might get it, but it would have to be $50 less; otherwise, on Newegg it's $10 more, so it makes no sense to get one.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
Maybe micro-stutter wasn't a good choice of words.

When I was testing the 2700k I came to the conclusion that hyper-threading isn't for me. Using the same clock speeds, for example, the system felt snappier with HT disabled. With HT the system seemed to be not as smooth, fluid, or responsive doing the things I do. At stock or very low baby clocks it wasn't noticeable. To me it just felt like the hyper-threading couldn't keep up with the real cores the faster the chip was clocked. It wasn't a placebo effect. I doubt I'm the only one who has felt this.

Yes, it was. Yes, you are. Sorry, but that's the simple truth. Hyper-Threading enhances system response when multi-tasking, and you're claiming it does the exact opposite. That's not true.

I didn't state that hyper-threading wasn't worth it, that it sucks, or that it doesn't have any use in the real world, as far as I remember. All I stated was that it's not for me. I didn't say that the 2550k/2500k's are better than the 2600k/2700k's :)
I think you did imply that. See here:

I only dabbled with hyper-threading for about a month or two, as I got a great deal on a 2700k and thought I'd see what the fuss was about. I think for the most part it's overrated. Depending on a person's use, though, it may be of benefit to them. Just for me it wasn't a good thing :)
Well, it's pretty much impossible for it [Hyper-Threading] to be a bad thing, so it has to be either a neutral or good thing.

This statement loses me on the whole "2550k is a defective 2500k" argument. The IGP may or may not have been defective; I guess only Intel would know.
The IGP is defective. If it weren't, it'd simply be sold as a 2500K. The reason the 2550K was launched is that Intel had a significant number of batches of CPUs with non-functioning IGPs, so they try to sell it to the public as "oh, it's clocked 100MHz higher and has the IGP disabled, so it's better for enthusiasts," when in fact it's a negative: you get a chip with a 99% chance of having defective parts, and you don't get anything good in return because it doesn't enable lower power consumption or higher overclocking.

For what it's worth, I couldn't care less anyway! I'm happy with my defective 2550k :D
It's a good deal if you don't need the IGP and you can find it a good amount cheaper than the 2500K. Otherwise, the 2500K wins. Enjoy your chip. :thumbsup:
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
I feel as strongly about my investment as you do about yours. How is that wrong? I encode with my IGPU and game on my regular card, and it works out great. If there were a deal on the 2550K I might get it, but it would have to be $50 less; otherwise, on Newegg it's $10 more, so it makes no sense to get one.

If I felt strongly about my investment I wouldn't overclock, and if I cared about Newegg prices, I'd mention them.
 

AkumaX

Lifer
Apr 20, 2000
12,648
4
81
LOL

2550k - 2500k = 50k

your logic is undeniable

I feel as strongly about my investment as you do about yours. How is that wrong? I encode with my IGPU and game on my regular card, and it works out great. If there were a deal on the 2550K I might get it, but it would have to be $50 less; otherwise, on Newegg it's $10 more, so it makes no sense to get one.

How do you (personally) encode w/ the IGP? For me, I encode using MeGUI (.avs AviSynth files), and AFAIK that's not compatible w/ QuickSync... unless you know of any QS-compatible converters that work w/ .avs files :p
 

Blades

Senior member
Oct 9, 1999
856
0
0
your logic is undeniable



How do you (personally) encode w/ the IGP? For me, I encode using MeGUI (.avs AviSynth files), and AFAIK that's not compatible w/ QuickSync... unless you know of any QS-compatible converters that work w/ .avs files :p

MediaCoder. The installation is a bit crappy (it's better to extract the files and create a portable version), but it works, and it's ridiculously fast. There are sample programs in the Media SDK that do the job just fine as well (actually, faster), as they can transcode via DirectShow. There's also GraphEdit. It depends on what you want to do, as QuickSync has built-in functions like resize and crop (which are ridiculously fast). You can also encode/decode multiple streams at once. It's serious business. Like 500+fps serious. Love it.
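
On the .avs question specifically, here's a hedged sketch of one route: it assumes an ffmpeg build compiled with both AviSynth input support and the h264_qsv QuickSync encoder (not every build has either), and the filenames are placeholders:

    import subprocess

    # Illustrative only: needs an ffmpeg built with AviSynth support
    # and the Intel QuickSync (h264_qsv) H.264 encoder.
    subprocess.run([
        "ffmpeg",
        "-i", "script.avs",        # the AviSynth script is the input
        "-c:v", "h264_qsv",        # hand the video encode to QuickSync
        "-global_quality", "23",   # quality target, roughly like x264 CRF
        "-c:a", "copy",            # pass audio through untouched
        "out.mp4",
    ], check=True)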

I should note that I actually prefer 'sample_multi_transcode' from the Intel Media SDK; MediaCoder is sometimes a bit hacky. It accepts raw x264 streams among other things; it also accepts a sort of batch/job file and outputs a perf file. I'd like to know how to access the QuickSync AAC encoder, but meh, no big deal.
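
For the batch/job file, this is roughly how I remember the syntax from the Media SDK sample docs (the filenames are placeholders and the exact flags may differ by SDK version), wrapped in a small Python driver:

    import subprocess

    # Two transcode sessions in one job file; syntax as I recall it from the
    # Media SDK samples: -i::<codec> input, -o::<codec> output, -hw for hardware.
    par = (
        "-i::h264 in1.h264 -o::mpeg2 out1.mpeg2 -hw\n"
        "-i::mpeg2 in2.mpeg2 -o::h264 out2.h264 -hw\n"
    )
    with open("jobs.par", "w") as f:
        f.write(par)

    # Runs every session and prints per-session fps when done.
    subprocess.run(["sample_multi_transcode", "-par", "jobs.par"], check=True)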
 