[Tom's Hardware] Sandy Bridge-E and X79 Platform Preview

Page 2 - AnandTech Forums
Mar 10, 2006
11,715
2,012
126
^This.

But I think it has to do with AMD falling too far behind. AMD simply doesn't earn the money to compete.

But imagine if they collapse? What happens then to competition? To innovation? To prices? If all you have is Intel?

That's the main reason I'm so pro-AMD. I would hate to see a world where the only choice you had was Intel.

They won't collapse. High end market isn't that big of a deal. As long as AMD can put out those awesome Fusion chips for the mainstream and an overclockable Bulldozer for the budget enthusiast, they'll be fine.

Also, you all forget the power of marketing. AMD will sell you 8 cores for $266! Do you have any idea how many "average joes" this will get? As someone who builds PCs for people, I am astonished at how frequently people will take a six core Phenom II over a quad core i5 because it has "six cores".
 

Edrick

Golden Member
Feb 18, 2010
1,939
230
106
Also, you all forget the power of marketing. AMD will sell you 8 cores for $266! Do you have any idea how many "average joes" this will get? As someone who builds PCs for people, I am astonished at how frequently people will take a six core Phenom II over a quad core i5 because it has "six cores".

+1

The average computer buyer wants more cores for less $$. Period.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Highly disappointed they nerfed the X79 platform this way. Now it is just a Z68 board with a larger socket. I hope the prices on these watered-down boards will reflect the features included (or not included, in this case).

I disagree. Aside from PCIe 3.0, which isn't all that useful with current cards, X79 is definitely more full-featured than any other current chipset. It supports SRT, x16/x16 SLI (which P67/Z68 do not), ten 6Gb/s SATA ports without an additional chip, quad-channel memory, and Intel Rapid Storage Technology. So it will be the high-end chipset of choice. I'm sure PCIe 3.0 will be added in later, but it's really not all that useful right now.

I'm tempted to get it for dual x16 alone with next-gen cards.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Then we will still be using SB CPUs for the next 5 years :(

And then Intel's revenue would collapse, as well as their stock price and everything else that goes into their business model... Intel cannot afford not to obsolete today's technology, regardless of what AMD is doing in the meantime.
 

Edrick

Golden Member
Feb 18, 2010
1,939
230
106
I disagree. Aside from pcie version 3 which isn't all that useful with current cards, x79 is definitely more full featured than any other current chipset. It supports SRT , x16/x16 SLI (which p67/z68 do not), 10 6gb/s SATA Ports without an additional chip, intel rapid storage technology. So it will be the high end chipset of choice. I'm sure pcie3 will be added in later, but its really not all too useful right now.

If you read the article, it says that PCIe 3.0, USB 3.0, SAS, and the 10 SATA III ports will all be omitted. Their board only has 2 SATA III ports, the same as P67 boards. That's why I said they are "nerfed".
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
If you read the article, it says that PCIe 3.0, USB 3.0, SAS, and the 10 SATA III ports will all be omitted. Their board only has 2 SATA III ports, same as P67 boards. So thats why I said they are "nerfed".

Ah, I see. Ugh. Hopefully the boards will be priced accordingly....that news is indeed pretty disappointing.
 
Mar 10, 2006
11,715
2,012
126
And then Intel's revenue would collapse, as well as their stockprice and everything else that goes into their business model...Intel cannot afford to not obsolete today's technology regardless what AMD is doing in the meantime.

Definitely +1 here. SNB came out even when Phenom II offered no competition to first-gen i7 chips... and SNB not only arrived at a really great price point, but it was also a huge leap microarchitecturally for Intel in both the CPU and GPU realms.
 

Edrick

Golden Member
Feb 18, 2010
1,939
230
106
And then Intel's revenue would collapse, as well as their stockprice and everything else that goes into their business model...Intel cannot afford to not obsolete today's technology regardless what AMD is doing in the meantime.

Perhaps I do not understand the business end of things as well as others, but without competition, why would Intel stick to its yearly tick-tock schedule? It could simply release faster versions of a uArch to prolong its lifespan, or add features like more cache, more cores, and better memory support. I do not think releasing a new uArch every 2 years would be required at that point. Just my opinion.
 

MiRai

Member
Dec 3, 2010
159
1
91
Also, you all forget the power of marketing. AMD will sell you 8 cores for $266! Do you have any idea how many "average joes" this will get? As someone who builds PCs for people, I am astonished at how frequently people will take a six core Phenom II over a quad core i5 because it has "six cores".
QFT
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I disagree. Aside from pcie version 3 which isn't all that useful with current cards, x79 is definitely more full featured than any other current chipset.

What? Let's analyse below.

x16/x16 SLI (which p67/z68 do not),

P67/Z68 can do x16/x16 with an NF200 chip. Not only that, but ASRock Gen3 boards support PCIe 3.0.

The Extreme Gen3 has five PCIe x16 slots.

But it has also been shown countless times that x8/x8 is no more than 2% slower.

Tom's even did three articles to show this.

10 6gb/s SATA Ports without an additional chip,

False. Only two SATA 6Gb/s ports natively. Read the article.

quad channel memory,

Useless for desktop apps. Sandy Bridge smokes Nehalem despite a dual-channel controller. Core i7 (1st) or (2nd) generation is not memory bandwidth starved in non-workstation/server apps. This again has been shown many times.

and intel rapid storage technology.

Available on other chipsets too. Even the P55 mobile chipset has this... big deal.

So it will be the high end chipset of choice.

Only by virtue of socket LGA 2011 having a six-core SB processor. The X79 chipset itself is crippled until they add USB 3.0, many more SATA 6Gb/s ports, and PCIe 3.0 natively. For a high-end chipset, lacking so many modern features is pretty embarrassing for Intel.

I'm tempted to get it for dual x16 alone with next gen cards.

$250-300 for a 2% performance boost from x16? Have fun, especially since next-generation cards will support PCIe 3.0. So you are contradicting yourself: you claim that x16 is a world of difference over x8 on PCIe 2.0, yet you claim PCIe 3.0 is not important since you are OK with PCIe 2.0 x16. Can you provide any benchmark that shows a meaningful difference between PCIe 2.0 x8 vs. x16?
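For reference, the bandwidth arithmetic behind the x8-versus-x16 argument can be sketched out. The signaling rates and encoding overheads (8b/10b for PCIe 2.0, 128b/130b for PCIe 3.0) are from the PCIe spec; everything else is simple multiplication:

```python
# Usable per-lane PCIe bandwidth: raw signaling rate reduced by encoding overhead.
def lane_bandwidth_mb_s(gtransfers_s, payload_bits, coded_bits):
    """Usable MB/s per lane, per direction."""
    return gtransfers_s * 1000 / 8 * payload_bits / coded_bits

pcie2 = lane_bandwidth_mb_s(5, 8, 10)     # PCIe 2.0: 500 MB/s per lane
pcie3 = lane_bandwidth_mb_s(8, 128, 130)  # PCIe 3.0: ~984.6 MB/s per lane

print(8 * pcie2 / 1000)   # PCIe 2.0 x8:  4.0 GB/s
print(16 * pcie2 / 1000)  # PCIe 2.0 x16: 8.0 GB/s
print(16 * pcie3 / 1000)  # PCIe 3.0 x16: ~15.75 GB/s
```

A PCIe 3.0 x8 link (~7.9 GB/s) is nearly equal to a PCIe 2.0 x16 link, so if x8 versus x16 barely matters on PCIe 2.0 today, PCIe 3.0 cards would widen that margin even further.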
 

Edrick

Golden Member
Feb 18, 2010
1,939
230
106
Useless for desktop apps. Sandy Bridge smokes Nehalem despite a dual-channel controller. Core i7 (1st) or (2nd) generation is not memory bandwidth starved in non-workstation/server apps.

Comparing SB dual-channel to Nehalem tri-channel is not valid, because SB has a much-improved memory controller:

In Nehalem, those units have three ports, but only one can do loads, so the chip is capable of a single load per cycle. In Sandy Bridge, the load/store units are symmetric, so the chip can execute two 128-bit loads per cycle.
 

AdamK47

Lifer
Oct 9, 1999
15,782
3,606
136
:hmm:

Why would you buy the 3960X over the 3930K when they're both binned the same? The only difference is that the 3930K had some unneeded L3 cache disabled and a meager 100MHz difference, probably to serve as an excuse to jack up the price 67%.

I'll list the immediate reasons that come to mind.

1. I'm not you.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
RussianSensation: I knew about the NF200. In fact, I posted about it yesterday in the video card forum if you'd like to check here: http://forums.anandtech.com/showthread.php?t=2188801

As far as I'm aware, and I may be wrong, only the most high-end enthusiast motherboards include it. I think the MIVE may have it, and that's what, a $350 board? And that's the ONLY board that I've heard of with it. Just about every NF200 board has been discontinued. Again, I could be wrong. You tell me, RS: do you feel a $350 motherboard is a viable solution to get x16/x16 SLI? Or would you rather a chipset support it natively? I'm aware of the current performance difference between dual x8 and dual x16 (I mentioned it in the above linked post); however, I'm thinking that could well change with the next generation of cards, which is what I had in mind. And I'd rather not buy a $400 NF200/X79 board to use dual x16.

Anyway, hopefully you got some great satisfaction out of tearing my post apart, if it makes you sleep better tonight. I admitted my error a few posts up after fully reading the Tom's article.
 

AdamK47

Lifer
Oct 9, 1999
15,782
3,606
136
Just looked at the gaming benchmarks. It's very evident that all three Intel platforms were GPU limited with the single GTX 580. It would be nice to see 3-way or 4-way SLI GTX 580 results.
 
Mar 10, 2006
11,715
2,012
126
:hmm:

Why would you buy the 3960X over the 3930K when they're both binned the same? The only difference is that the 3930K had some unneeded L3 cache disabled and a meager 100MHz difference, probably to serve as an excuse to jack up the price 67%.

You know, I've been reading your argument over and over again. Let me tell a little story. Back in high school, I scored a 2340/2400 on my SAT. That score is fantastic, but it was not the best possible. I had spent 6 months preparing to get that 2340. I spent another 6 months practicing my tush off, retook the exam, and got my perfect score.

6 months of effort for a measly 60 points? When my old score was just fine? Yeah, for most people my score was great, but I wanted the best, and I got it.

The same thing holds for having the best CPU. Some people, myself included, want the best, no matter what the cost. Once I graduate college and get a real job, you can bet that I will be buying only extreme edition CPUs. (Until then, I buy i7 2600k price segment chips that maximize performance per dollar)
 

Edrick

Golden Member
Feb 18, 2010
1,939
230
106
Does it mean I am old if I have no clue what a 2400 SAT score means? When I was in HS, the max score was 1600.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
I disagree. Aside from pcie version 3 which isn't all that useful with current cards, x79 is definitely more full featured than any other current chipset. It supports SRT , x16/x16 SLI (which p67/z68 do not), 10 6gb/s SATA Ports without an additional chip, quad channel memory, and intel rapid storage technology. So it will be the high end chipset of choice. I'm sure pcie3 will be added in later, but its really not all too useful right now.

I'm tempted to get it for dual x16 alone with next gen cards.

You do realize only a Radeon HD 6990 at 2560x1600 with 4xAA saturates PCIe 2.0 x8, right? The next-gen cards that are coming out won't come close to the kind of bandwidth the 6990 pushes, which is why the extra PCIe lanes only make sense if you want PCIe 2.0 x8 Tri-CF/SLI.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
You know, I've been reading your argument over and over again. Let me tell a little story. Back in high school, I scored a 2340/2400 on my SAT. This score is fantastic, but it was not the best possible . I spent 6 months preparing to get that 2340. I spent another 6 months practicing my tush off, retook the exam, and got my perfect score.

6 months of effort for a measly 60 points? When my old score was just fine? Yeah, for most people my score was great, but I wanted the best, and I got it.

The same thing holds for having the best CPU. Some people, myself included, want the best, no matter what the cost. Once I graduate college and get a real job, you can bet that I will be buying only extreme edition CPUs. (Until then, I buy i7 2600k price segment chips that maximize performance per dollar)

Bad analogies are a given here. The point is that there's no difference between the two chips once they're overclocked. That's why Tom's Hardware themselves said they're much more interested in the 3930K than the 3960X. The 3930K is just a 3960X with a 100MHz lower clock speed and some cache disabled for perceived differentiation. It's not faster, which is the point of my argument.
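The "67%" figure from the quoted post does check out, assuming launch prices of roughly $599 for the 3930K and $999 for the 3960X (those exact dollar amounts are my assumption; the thread itself never states them):

```python
# Price premium of the 3960X over the 3930K, using assumed launch prices.
price_3930k = 599  # assumed 3930K launch price, USD
price_3960x = 999  # assumed 3960X launch price, USD

premium_pct = (price_3960x - price_3930k) / price_3930k * 100
print(round(premium_pct))  # ~67 (percent)
```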
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
RussianSensation: I knew about the NF200. In fact, I posted about it yesterday in the video card forum if you'd like to check here: http://forums.anandtech.com/showthread.php?t=2188801

As far as i'm aware, and I may be wrong -- only the most high end enthusiast motherboards include it. I think the MIVE may have it, and thats what? A 350$ board? And thats the ONLY board that i've heard of with it. Just about every NF200 board has been discontinued. Again, I could be wrong. You tell me, RS, do you feel a 350$ motherboard is a viable solution to get x16/x16 sli? Or would you rather for a chipset to support it natively. I'm aware of the current performance difference between dualx8 / dual x16 ( I mentioned it in the above linked post), however, i'm thinking that could well change with the next generation of cards, which is what I had in mind -- And i'd rather not buy a 400$ nf200/ x79 board to use dual x16.

Anyway, hopefully you got some great satisfaction out of tearing my post apart, if it makes you sleep better tonight. I admitted my error a few posts up after fully reading the toms article.

Sandy Bridge motherboards with NF200 start at $250...
 

MrX8503

Diamond Member
Oct 23, 2005
4,529
0
0
either way its $$$ and probably won't be any faster in games than a 2600k for 4 years, at which point both will be obsolete.

I hear this a lot, and it's what I don't get. Why do most people think everyone only uses their machine for gaming?
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Didn't realize AMD had any thunder left to steal.....

Which is too bad and I believe this is the reason Intel is able to put out "nerfed" SB-E chipsets and delay IB. Lack of competition only hurts the consumers.

About the only thunder left at AMD is a fart in a box.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
And then Intel's revenue would collapse, as well as their stockprice and everything else that goes into their business model...Intel cannot afford to not obsolete today's technology regardless what AMD is doing in the meantime.

People have this imaginary idea that a monopoly means a reduced release schedule of new products or a lack of innovation. Take a look at Microsoft: they were legally defined as a monopoly in the 1990s.

We had Windows 3.1, 3.11, Windows 95, Win98, and Win98SE. The next decade saw Win2K, WinXP, Vista, and Win7.