Does LGA2011 have a future beyond Sandy Bridge-E?

Page 2

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
We know that, but there will be an LGA1356, which is a slight variation on 1366.

I've never heard of socket 1356. What is that?


RussianSensation: yes, I know you can disable GPU from BIOS, or even the HT if you hate it. What I meant is paying extra for the silicon for that GPU which I won't use.

I understand, but for an X79 motherboard you'll also be paying extra for 14 SATA ports, quad-channel memory support and SAS ports that you probably won't use either.
 
Mar 10, 2006
11,715
2,012
126
RussianSensation, Socket 1356 is the "successor" to LGA 1366 in the workstation arena. 95W TDP CPUs, up to 8 cores, triple channel memory, 24 PCI-E 3.0 lanes. Basically budget LGA 2011.
 

Axonn

Senior member
Oct 14, 2008
216
0
0
RussianSensation: exactly. Hence, I shall not go for it. I'll use Z68 and that's it. Sandy 2600K, or, if I get really angry, 2500 standard. This is because I'll be getting my computer only in December, and that's awfully close to Ivy Bridge. I don't think I need to get a good Sandy. Just something to last me until I go Ivy. A 2500 would do just fine.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
RussianSensation, Socket 1356 is the "successor" to LGA 1366 in the workstation arena. 95W TDP CPUs, up to 8 cores, triple channel memory, 24 PCI-E 3.0 lanes. Basically budget LGA 2011.

I believe Intel will not launch this platform and they will only release socket 2011.
 

greenhawk

Platinum Member
Feb 23, 2011
2,007
1
71
This isn't even conjecture. It is just some random hypothesis with absolutely no backing.

True, but if I could see into 2014 and get the answer for you, I would stop off and get some stock market tips as well.

I see it as just as likely as saying that the 2014 replacement for IB WILL have DDR4 only.
 

pantsaregood

Senior member
Feb 13, 2011
993
37
91
Placing an extra memory controller on a CPU is a horrible business decision. It drives production costs up. AMD had no reason for putting a DDR2 controller on Phenom II-based CPUs other than saving face on the AM2+ socket. AM2+ was extremely short-lived. If they hadn't allowed some sort of backwards compatibility, it would have further damaged their reputation.
 

pantsaregood

Senior member
Feb 13, 2011
993
37
91
Socket A lasted from 2000 to late 2004. A lot of Socket A boards supported 333 MT/s FSB, so you could realistically move from a 600 MHz Duron to an Athlon XP 3200+.

That said, the 333 MT/s FSB 3200+ was an oddity.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Which reminds me how disappointed I was when I found out what the socket lifespan of Bulldozer is going to be: ZERO. AM3 then bye bye.

Lame...........

Not everyone really cares if they can upgrade their CPU on the same socket. I am pretty sure that 2-3 years from now, we'll want a brand new mobo for Bulldozer Next / Haswell anyway (PCIe 3.0, SATA Express, Thunderbolt, other cool features, etc.). The most important thing is that the BD architecture is good in performance / watt, multi-threaded performance per module, and instructions per clock / per core. This architecture will be with us for a while; whether AM3+ gets replaced by FM+ or some other socket is of little concern really.


Socket A lasted from 2000 to late 2004. A lot of Socket A boards supported 333 MT/s FSB, so you could realistically move from a 600 MHz Duron to an Athlon XP 3200+.

That said, the 333 MT/s FSB 3200+ was an oddity.

This is where people fall into the socket trap. Socket A may have lasted that long, but motherboards sure as hell didn't! I know, I had an Athlon XP 1600+.

First VIA 266 got upgraded to 266A, then we got VIA 333 and VIA 400. So ya, along the way your 266 motherboard got stuck at XP 2700+. After that, if you wanted to get that magical Barton 2500+ to 3200+ speeds, you needed a 400 FSB capable board. So if you had a VIA 333 board, you were out of luck without overclocking.

And then there was the fact that the Athlon platform needed at minimum a 1:1 CPU:RAM ratio (or at least as far as I remember). As such, I am pretty sure you needed fast enough RAM (i.e., when the Athlon XP first debuted, most RAM was still DDR266 and over time transitioned to DDR400). Therefore, by the time FSB400 chips arrived, your RAM from 2 years ago wasn't fast enough to enable those speeds. And then there were additional features brought along the way, such as nForce 2 boards with awesome onboard sound. It was a lot more complicated than you make it sound.

Same with Socket 775. You are missing key transitions such as SATA I --> II, AGP to PCIe, etc. In other words, sure, the motherboards were all Socket 775, but the features on the boards continued to evolve, making older boards completely obsolete.

Personally, I would rather sell my CPU+mobo+RAM and keep upgrading to better parts every 2-3 years. I also find it's a lot easier to sell these 3 components as a combo, since many users with older systems want a quick swap upgrade. I have been reselling parts for at least 5 years, and it has worked far better for me to sell the package or the entire system than to sell on a piece-by-piece basis. I always get very low offers when I am just trying to sell the CPU or the motherboard alone. So from a resale standpoint, it also works out better to get rid of the mobo + CPU (and even RAM) at once (at least in my experience).
 
Last edited:

Tuna-Fish

Golden Member
Mar 4, 2011
1,661
2,520
136
Placing an extra memory controller on a CPU is a horrible business decision. It drives production costs up. AMD had no reason for putting a DDR2 controller on Phenom II-based CPUs other than saving face on the AM2+ socket. AM2+ was extremely short-lived. If they hadn't allowed some sort of backwards compatibility, it would have further damaged their reputation.

I don't think AM3 CPUs so much had two memory controllers as they had a memory controller that could work with both DDR2 and DDR3. AMD was on the committee that standardized DDR3, and made sure it was close enough to DDR2 that this was technically feasible.
 

PreferLinux

Senior member
Dec 29, 2010
420
0
0
RussianSensation: yes, I know you can disable GPU from BIOS, or even the HT if you hate it. What I meant is paying extra for the silicon for that GPU which I won't use. But, as others pointed out, LGA 2011 doesn't make sense for quad core. And hence I definitely do NOT need (or want to pay for) 6 or 8 cores...

PreferLinux: The 4 core is not a K series? Bah! I just realized that right now! Just checked Wiki. Pffffff. Screw this. I'm back to 2600K, period.

greenhawk: you're probably right that the GPU doesn't cost anything in terms of silicon. This reminds me of the junk Intel are doing with the "upgrade cards". Somebody should sue them.
Maybe they should sue us for overclocking? Or all the computer stores that will overclock your system for you if you pay them?


Correct, plans for the 1356 were scrapped.
I believe Intel will not launch this platform and they will only release socket 2011.
It was scrapped for the desktop, but where do you get that it was completely scrapped?
 

Axonn

Senior member
Oct 14, 2008
216
0
0
RussianSensation: I do care about upgrading my CPU on the same socket. Let me tell you why.

While indeed for my new machine I will almost definitely NEVER use the same socket, I DO NOT throw away my computers. I donate them, most of the time to somebody close to me. And when that somebody is VERY close, then I also upgrade the computer a bit.

The fact that my current Core 2 Duo can STILL BE UPGRADED, even now after FIVE DAMN YEARS, is awesome. I can actually put a nice Core 2 Quad in this mobo, at a very low price, and practically push the computer forward from 2007 where it currently is, to about 2009 ::- D. For a very low cost too. Which is cool.

With Bulldozer, there won't be any of that. AMD doesn't keep producing old models for too long because they don't have the factories to do so. If they had kept at least Bulldozer gen 2 on the same socket, I would have been a happy camper. But like this... bah!

PreferLinux: let's give Intel some time ;;- ). I'm sure they can come up with something. Especially if AMD continues to limp away into the shadows. Damn, why doesn't IBM or Samsung or whoever-the-frack buy them? AMD is a nice company. They could be even nicer with some good investor behind them. Oh well...
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
RussianSensation: I do care about upgrading my CPU on the same socket. Let me tell you why.

While indeed for my new machine I will almost definitely NEVER use the same socket, I DO NOT throw away my computers. I donate them, most of the time to somebody close to me. And when that somebody is VERY close, then I also upgrade the computer a bit.

The fact that my current Core 2 Duo can STILL BE UPGRADED, even now after FIVE DAMN YEARS, is awesome. I can actually put a nice Core 2 Quad in this mobo, at a very low price, and practically push the computer forward from 2007 where it currently is, to about 2009 ::- D. For a very low cost too. Which is cool.

With Bulldozer, there won't be any of that. AMD doesn't keep producing old models for too long because they don't have the factories to do so. If they had kept at least Bulldozer gen 2 on the same socket, I would have been a happy camper. But like this... bah!

PreferLinux: let's give Intel some time ;;- ). I'm sure they can come up with something. Especially if AMD continues to limp away into the shadows. Damn, why doesn't IBM or Samsung or whoever-the-frack buy them? AMD is a nice company. They could be even nicer with some good investor behind them. Oh well...

Platforms that can be upgraded for many years are the exception, not the rule, and even with next-gen CPUs working on old boards, there's always the issue of minor incompatibilities, BIOS issues, etc.

As for your second point, I don't think AMD is hurting. Their GPUs are powering all of the next-gen consoles, so I don't think they're going anywhere soon - although I do hope that BD is a worthy product. Certainly, Intel having viable competition would not be a bad thing.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Ivy Bridge was first designed only for s2011, and then Intel decided to make it for 1155 also. 2011 will definitely get Ivy Bridge and could possibly support Haswell.

Yeah, we can thank our buddies at AMD for that one. If BD had been less late/slow/sucky, then Intel wouldn't have been able to get away with splitting the platforms like this.
 

PreferLinux

Senior member
Dec 29, 2010
420
0
0
Have you seen any Intel roadmap with socket 1356 as of late (either desktop or server)?
No, but I do know that the LGA 1356 and Sandy Bridge Wikipedia pages still show it as of yesterday, both having been edited the day before or so.