Rumor: Intel to delay releasing Ivy Bridge


Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
It does seem backwards. It seems to me that Intel almost wants to piss enthusiasts off with their recent decisions. The 990X is much more expensive, but the 2600K chases it in most benchmarks; the original SB chipsets made you choose between OCing and video out...

Intel is proving just how twisted things can and will continue to get so long as they hold the performance crown and customers are willing to support it.

It wouldn't work if people weren't willing to buy into it. But they do, and it does.

I too yearn for the simpler days when we had a nice linear product lineup and it was easy to identify the top-of-the-line and know where everything fell in its shadow.

Not so anymore. We, the consumers, get the products we deserve. Just wait until the RAM guys follow suit, and the GPU guys, and the hard drive guys. You won't know this year's products from last year's or next year's when this is all said and done.
 

drizek

Golden Member
Jul 7, 2005
1,410
0
71
Core i7s can be anything between a 2-core/4-thread chip and, soon, an 8-core/16-thread chip. Sitting right next to each other at Best Buy. It's insanity.

I can't count the number of times I've read people say "OMG IT HAS AN i7" when referring to a 1.3GHz dual-core Arrandale CULV.

Hell, now that "9xx" can refer to a chipset, a Phenom II X4, a Nehalem Core i7, or a Westmere Core i7, I just don't even bother anymore.

I think with the market slowing down, the idea is to be able to keep all your old crap floating around store shelves without people being able to differentiate it from the new stuff. Incredibly cynical.
 

Borealis7

Platinum Member
Oct 19, 2006
2,901
205
106
Well, I am fairly certain Intel confirmed that they will allow BCLK overclocking on LGA 2011. And if it was not officially confirmed, it is one of the most agreed-upon rumors going.
An Intel engineer (SB team, Haifa, Israel) told me, and I quote: "Some operations on the Ring Bus simply cannot work with BCLK over 100MHz."

So for now I tend to think OCing will remain multiplier-only, but what do I know...
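
(To make it concrete why BCLK is so touchy: on SB pretty much every domain is derived from that one 100MHz base clock, so nudging BCLK drags DMI and PCIe along with the cores. A toy calculation, with illustrative numbers rather than Intel specs:)

#include <stdio.h>

int main(void) {
    /* Toy model of Sandy Bridge clocking: core, DMI and PCIe are all
     * multiples of the same base clock. Values are illustrative only. */
    double bclk = 105.0;                           /* MHz, a mere 5% bump */
    double core_mult = 34.0;                       /* e.g. a 2600K        */
    printf("core: %.0f MHz\n", bclk * core_mult);  /* 3570 MHz            */
    printf("PCIe: %.0f MHz\n", bclk * 1.0);        /* 5% out of spec      */
    printf("DMI:  %.0f MHz\n", bclk * 1.0);        /* likewise            */
    return 0;
}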
 

podspi

Golden Member
Jan 11, 2011
1,982
102
106
An Intel engineer (SB team, Haifa, Israel) told me, and I quote: "Some operations on the Ring Bus simply cannot work with BCLK over 100MHz."

So for now I tend to think OCing will remain multiplier-only, but what do I know...

It's good to know there was a reason at least. I wonder if there might be some sort of work-around? I kinda assumed 2011 would have BCLK OC'ing...
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
It's good to know there was a reason at least. I wonder if there might be some sort of work-around? I kinda assumed 2011 would have BCLK OC'ing...

Given the multiple clock domains as well as multiple voltage domains involved, the analog components in the circuits necessary for making the SoC parts work could very well be limited to sub-130MHz operations in a way that traditional CMOS scaling does not readily circumvent.

It could also have something to do with the fact that Intel forwent dynamic CMOS in favor of static CMOS techniques when they released Nehalem.

[attached image: domino logic]
 

podspi

Golden Member
Jan 11, 2011
1,982
102
106
I can't claim to fully understand that, but it is clear to me that Intel is taking power consumption very seriously. I really can't complain about that. Thanks IDC!

Just think, in 10~20 years, we might have the equivalent of SB in a smartphone :awe:

Next year if NV is to be believed :p
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
I can't claim to fully understand that, but it is clear to me that Intel is taking power consumption very seriously. I really can't complain about that. Thanks IDC!

Just think, in 10~20 years, we might have the equivalent of SB in a smartphone :awe:

Next year if NV is to be believed :p

I don't think we'll have to wait even 10 years, considering there is some SB SKU (someone else posted it here, can't remember where though) that already operates at a mere 8.5W TDP on 32nm.

I wouldn't count on Intel though, not for this app-space. Nvidia or AMD will field the parts that get into smartphones before Intel does. Intel has seemed so unfocused since the Pentium 4 that it isn't funny.

They are focused on the mainstream CPU, that is for sure, and that is good. But elsewhere, be it Itanium or Atom or Larrabee, it just seems like they are "phoning it in" lately.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Really. I hope you enjoy your Llano. Itanium, Atom, and Larrabee are all "phoned in"? My lord, I can't see any useful benefit in working on said projects. Before we put AMD into smartphones, let's see how the 5-watt Brazos does in tablets. That's a lot of watts for a tablet; I see few design wins there. Seeing as all three products you alluded to are in use, it's hard to see how anyone in the industry can overlook the research and development. So many were grateful to see the P4 die, yet I look at SB and say gee whiz, there's a lot of P4 in here, some Itanium as well, and with Haswell likely Larrabee elements will be added. Phoned in, yeah right. So you're publicly stating that Intel's 32nm Atom will not have one phone design win, is that not so, or did I miss something?
 

podspi

Golden Member
Jan 11, 2011
1,982
102
106
I don't think we'll have to wait even 10 years, considering there is some SB SKU (someone else posted it here, can't remember where though) that already operates at a mere 8.5W TDP on 32nm.

I wouldn't count on Intel though, not for this app-space. Nvidia or AMD will field the parts that get into smartphones before Intel does. Intel has seemed so unfocused since the Pentium 4 that it isn't funny.

They are focused on the mainstream CPU, that is for sure, and that is good. But elsewhere, be it Itanium or Atom or Larrabee, it just seems like they are "phoning it in" lately.


It's crazy because I think they were really forward-looking with Atom, and then they just DROPPED it. I think Intel's major problem (if it can be said to have a problem) is the same as Microsoft's: they're making TONS of money now, but trends suggest the gravy-train ride is almost over ("almost" meaning within the next decade or so). Internally, there is a battle between the long-run sustainability of profits and the short-run instinct to protect the cash cow.


Someone once said to me (after getting a gen-1 iPhone), "I don't even need a computer anymore!" I would expect that smartphone CPU ASPs are much lower than desktop or laptop ASPs.

Intel is stuck between enabling this huge shift and becoming irrelevant. I think they'll come through eventually, but I'm not surprised there is at least some resistance.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Really. I hope you enjoy your Llano. Itanium, Atom, and Larrabee are all "phoned in"? My lord, I can't see any useful benefit in working on said projects. Before we put AMD into smartphones, let's see how the 5-watt Brazos does in tablets. That's a lot of watts for a tablet; I see few design wins there. Seeing as all three products you alluded to are in use, it's hard to see how anyone in the industry can overlook the research and development. So many were grateful to see the P4 die, yet I look at SB and say gee whiz, there's a lot of P4 in here, some Itanium as well, and with Haswell likely Larrabee elements will be added. Phoned in, yeah right. So you're publicly stating that Intel's 32nm Atom will not have one phone design win, is that not so, or did I miss something?

Where's Larrabee?
We just got off the phone with Nick Knupffer of Intel, who confirmed something that has long been speculated upon: the fate of Larrabee. As of today, the first Larrabee chip’s retail release has been canceled. This means that Intel will not be releasing a Larrabee video card or a Larrabee HPC/GPGPU compute part.

http://www.anandtech.com/show/3592

What's up with Itanium since Oracle outed the questionable roadmap?
Intel Corp. and Hewlett Packard may postpone the public update of the Itanium microprocessor roadmap because of certain reasons, a source with knowledge of the matter said. As Intel relocates Itanium development teams to Xeon central processing units in order to strengthen competitive positions of its primary server chip, HP seems to be in confusion regarding its own strategy.

Intel and HP planned to disclose post-Kittson (Itanium chip due in 2014 or 2015) in late April, 2011, according to unofficial sources. However, the event may be put on hold, some believe that because HP is unsure about its roadmap; other people suggested that Intel wanted to hold the information till its analyst day in May.

http://www.xbitlabs.com/news/cpu/di...oadmap_Update_amid_Uncertainties_Sources.html

And ATOM is doing how well in tablets, let alone smartphones?
Prototype Tablets with Intel Atom Processors Benchmarked, Prove Hot and Slow

By the looks of early testing, Intel desperately needs all the help it can get. A dual core Z6xx series atom chip running on the company's new Oak Trail chipset was shown off in a prototype design by Taiwan's Compal Electronics.

http://www.dailytech.com/article.aspx?newsid=21815

There is a reason, a good reason, why Intel has suddenly wanted to talk about 22nm and 14nm.

My point is that Intel has no excuse for not doing better in all these areas, but they just don't seem to want it as badly as their competition.

I could be wrong, I don't have to be right about this, but that's just how I see it. They executed well on Sandy Bridge. The wheels just kinda came off of a lot of parallel projects though.

Tukwila was how late? So late that it was measured in years, and now things are more clouded in mystery than ever after Oracle outed HP and Intel and nothing substantial was delivered to counter their statements.

Atom is doing how well? So well that Intel has to talk up its 14nm siblings to draw attention away from the 45nm and 32nm SKUs.

And Larrabee just completely imploded, leaving nothing in its wake but more powerpoint engineering and a few bullet points on Pat Gelsinger's resume.
 

Khato

Golden Member
Jul 15, 2001
1,251
321
136
And Larrabee just completely imploded, leaving nothing in its wake but more powerpoint engineering and a few bullet points on Pat Gelsinger's resume.

Haha, so appropriate! The push towards Larrabee graphics was pretty much entirely his doing, and we all know how that turned out for him. I'd imagine that without his push towards a graphics usage, the original HPC Larrabee idea would already have a product out. Still, there will eventually be an HPC product, now that the aimless wandering of what really amounted to a massive research project has been mostly eliminated.

Itanium really amounts to the same thing as Larrabee... a research project turned into a product, but with some amount of actual success. Likely more, if not for AMD and Microsoft locking us into x86 in the 64-bit transition. It certainly makes sense to move away from it now, though, with Xeon reaching the same performance and more need for design resources in the low-power area.

Finally there's Atom, where it is indeed quite shameful that Intel basically just left the initial idea to sit there and make money without any real changes. I'd guess that their intent was simply to work on power and integration, which has certainly led to improvements on those fronts, but performance has been left stagnant. That said, linking to an article that's simply seizing an opportunity to bash Atom, instead of actually explaining why the performance was so abysmal, is somewhat misleading. It's understandable if the intent is to show that Intel really needs to devote resources to doing what Google won't: make an x86 port of Android that actually performs like it should. But the performance is purely a software issue and will certainly be resolved before actual production.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Where's Larrabee?


What's up with Itanium since Oracle outed the questionable roadmap?


And ATOM is doing how well in tablets, let alone smartphones?


There is a reason, a good reason, why Intel has suddenly wanted to talk about 22nm and 14nm.

My point is that Intel has no excuse for not doing better in all these areas, but they just don't seem to want it as badly as their competition.

I could be wrong, I don't have to be right about this, but that's just how I see it. They executed well on Sandy Bridge. The wheels just kinda came off of a lot of parallel projects though.

Tukwila was how late? So late that it was measured in years, and now things are more clouded in mystery than ever after Oracle outed HP and Intel and nothing substantial was delivered to counter their statements.

Atom is doing how well? So well that Intel has to talk up its 14nm siblings to draw attention away from the 45nm and 32nm SKUs.

And Larrabee just completely imploded, leaving nothing in its wake but more powerpoint engineering and a few bullet points on Pat Gelsinger's resume.

Larrabee is in the hands of developers. It's called Knights Ferry, and Knights Corner is coming to HPC at 22nm. You said AMD will be in phones before Intel. Intel's 32nm is supposed to go into phones, and I suspect they have at least one design win. What's AMD putting into a phone? Not Brazos. Actually, AT said Atoms are enjoying stellar sales recently. Oak Trail tablets are to be released shortly. What tablet is using the 5-watt AMD chip?

Even though Knights Corner may or may not be for graphics, it will be in HPC to battle NV.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
I read that hearsay article when it was released. So if a 3-watt part is hot, how's the 5-watt part going to fare? Intel has over 30 tablet design wins; how many does AMD have with a HOT 5-watt chip? You have seen the NEW ultra notebooks; Intel has a plan in place to cover everything down to the Atom-class processor. Oak Trail is a 45nm part, and 32nm is coming up right behind it. So I agree Intel isn't doing much until the 22nm Atom. But in the meantime they will sell millions of the craptastic Atoms that are available at 45nm, and at 32nm when released.
 

bridito

Senior member
Jun 2, 2011
350
0
0
I'm totally confused. I just read that X79 is Patsburg. In that case, which chipset is for the SB-E Xeons? And are there any rumors on whether it will be released at the same time as the enthusiast SB-Es?
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
Intel is failing with Atom because they do not want it to succeed. Look at the SU7300. It is what, 4 years old? It is powerful and uses just 10 watts. SU7300 + Ion is perfect for a tablet. They can surely do better than that with a CULV SB and have a fairly modern GPU, all for only about 8 watts max. The problem is they won't take less than 65% margins, which means $250 for the chip, which means $800 for whatever device it goes into. There's no market for that. But if they sold it for $100? Hmm... high volume might get systems selling for under $500. Then we get an entire ecosystem in place, which creates higher demand for Windows tablets. So Intel takes a hit on profits, but they turn around and crush ARM. Intel can do this, but will they?
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Well, actually, it will likely turn into a very good add-on board for HPC. Intel clearly stated that if it doesn't introduce scatter/gather and other additions, GPUs will simply walk away from Intel's GPUs; Intel is working hard to make sure both the CPU and IGP take advantage of all components on its SoC CPUs. So Intel isn't trying to fool anyone and is fully aware of its situation. Good thing software lags so far behind. As far as I can see, in the phone/tablet area Imagination Tech seems to be leading the way. But that's not an American company, so its GPU gets ignored on the forums, while ARM gets talked about extensively because NV is using it.

The stinking joke isn't over with yet.

http://www.geek.com/articles/chips/intel-unveils-knights-corner-50-core-server-chip-2010061/
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Idontcare, as sm625 brought up, you can explain most of Intel's seemingly choppy behavior based on margins. Look how they kept pushing their hot and hungry chipset with the original Atoms; I'm positive that was to get the most out of their old process. It does seem like Intel is now responding to Brazos and ARM by promising OEMs quicker movement of Atom to better process tech. Bottom line: their low-power customers have to pull Intel kicking and screaming toward anything that will eat into profits.

Heck, it wouldn't be surprising to discover that a lot of the large orders hinted at for Llano and Brazos have a secondary purpose: getting Intel to budge some on price and product attributes.
 

drizek

Golden Member
Jul 7, 2005
1,410
0
71
Intel was never interested in building netbooks. They only ended up doing it because at some point they realized that Atom was a long, long, long way away from being competitive with ARM in actual mobile devices. That chipset basically made my Dell Mini a worthless brick. No media capabilities whatsoever and ridiculous power consumption leading to 2-3 hour battery life.

I can't ever think about Intel chipsets without my blood pressure going up. Don't even get me started on all the problems I had with my Inspiron 6000 because Intel decided to only half-implement DirectX 9. When Vista came out, the system became a pile of trash. Now it is just a glorified NAS running Windows Server 2003. I tried to turn it into a wireless router, but... don't get me started on Intel's wireless cards, either. That 2915 was such a pain...

My nearly identically specced Inspiron 9000 with an Nvidia 7800 GTX is still going over 5 years later, running Windows 7 tolerably well.

sigh...

It really takes a lot to make Broadcom come out looking good.
 

gramboh

Platinum Member
May 3, 2003
2,207
0
0
Atom is better now? I'm using an Asus 1015PN netbook with Atom N550 (1.5GHz dual core) and NV ION2. With Intel graphics, battery is 6+ hours (brightness turned down) of browsing/video playback. With ION2 on I can play 1080p x264 for about 3-4 hours, more if just outputting on HDMI. It does get hot though.
 

drizek

Golden Member
Jul 7, 2005
1,410
0
71
Your netbook weighs as much as a 13" MacBook Air with a 1.86GHz Core 2 Duo, and it gets significantly worse battery life (the batteries have approximately the same Wh rating).

Atom just sucks as a platform.
 

CPUarchitect

Senior member
Jun 7, 2011
223
0
0
Larrabee technology is likely to be merged into the CPU. The addition of AVX was the first step, and it was clearly intended for long-term convergence: fused multiply-add (FMA) instructions have already been specified, and the encoding is readily extendable to 1024-bit vectors.

With FMA slated for the Haswell architecture, that means a mainstream 8-core CPU could deliver 1 TFLOP of computing power as early as 2013. All they'd have to add to make efficient use of it is gather/scatter instructions like Larrabee's. And to significantly lower power consumption, they could execute 1024-bit AVX instructions on the existing 256-bit execution units over four clock cycles.
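
Putting rough numbers on that 1 TFLOP figure (the inputs are my assumptions, not an Intel roadmap: two FMA pipes per core and a ~4GHz clock):

#include <stdio.h>

int main(void) {
    /* Back-of-the-envelope peak throughput for a hypothetical
     * FMA-equipped 8-core. Every input here is an assumption. */
    double cores         = 8;
    double lanes         = 256.0 / 32.0;  /* SP floats per AVX vector   */
    double fma_per_cycle = 2;             /* assumed FMA pipes per core */
    double flops_per_fma = 2;             /* a*b + c = mul + add        */
    double clock_ghz     = 4.0;

    double gflops = cores * lanes * fma_per_cycle * flops_per_fma * clock_ghz;
    printf("peak: %.0f GFLOPS\n", gflops);  /* 1024 GFLOPS, ~1 TFLOP */
    return 0;
}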

The end result would be a CPU which can deliver IGP-level performance for legacy graphics, and would truly excel at future complex workloads. The possibilities of such a homogeneous architecture which combines high throughput with high serial performance are endless. GPUs are evolving in the same direction by becoming ever more programmable, but CPUs can get there faster with just a few additions. It's also more efficient to compute everything on a single chip.

Note that unlike Larrabee it doesn't have to compete against the competition's high-end GPUs. It would be an instant success in the HPC market, and by ditching the IGP we'd get extra room for CPU cores (Sandy Bridge could already have been a 6-core design). Gather/scatter support would also facilitate auto-vectorization for the majority of software, so it requires minimal effort from developers.

Basically, evolving AVX toward LRBni would be a zero-risk strategy for Intel to gain GPU and HPC market share from below. By the end of the decade you could be opting for a 64-core CPU instead of a discrete GPU...
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
CPUarch, that was a really good post. You said so efficiently what takes me a whole page to say, without causing confusion. Well done. Do you think maybe Intel will increase AVX on IB from 256-bit to 512? I do. You see Haswell exactly as I see it: 1024-bit with scatter/gather. That's what all of Intel's documents hint at, and their naming scheme.
 

podspi

Golden Member
Jan 11, 2011
1,982
102
106
Can anyone explain to me WHAT gather and scatter instructions are?! Are they supposed to make things multithreaded automagically?
 

CPUarchitect

Senior member
Jun 7, 2011
223
0
0
Do you think maybe Intel will increase AVX on IB from 256-bit to 512? I do.
No, Ivy Bridge absolutely won't have 512-bit AVX. The most recent AVX documentation added a specification for "post-32nm instructions", which are expected to be supported by Ivy Bridge. These mainly add support for 16-bit floating-point numbers. Since Ivy Bridge is mainly a process shrink, they won't add anything that requires architectural changes. Adding a few instructions is a relatively small change to the ALUs, while supporting 512-bit instructions affects lots of other things.
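
If you're curious what that 16-bit float support looks like in practice, it's just conversions to and from single precision; storage shrinks to half-size while the math stays 32-bit. A minimal sketch using the conversion intrinsics from Intel's published spec (assuming your compiler already exposes them):

#include <stdio.h>
#include <immintrin.h>

/* build with e.g. gcc -mf16c */
int main(void) {
    float in[4] = { 1.0f, 0.333333f, 65504.0f, -2.5f };
    float out[4];

    __m128  f = _mm_loadu_ps(in);
    /* pack four 32-bit floats into four 16-bit halves, round to nearest */
    __m128i h = _mm_cvtps_ph(f, _MM_FROUND_TO_NEAREST_INT);
    /* expand the 16-bit halves back to 32-bit floats */
    __m128  g = _mm_cvtph_ps(h);

    _mm_storeu_ps(out, g);
    for (int i = 0; i < 4; i++)
        printf("%g -> %g\n", in[i], out[i]);  /* note the precision loss */
    return 0;
}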

There's no official information yet on when FMA support will be added; it is expected to be delayed until Haswell.

Widening the vectors and/or adding gather/scatter support can realistically only be expected in the next major architecture, in 2015. It takes many years to research and design a new CPU...
You see Haswell exactly as I see it: 1024-bit with scatter/gather. That's what all of Intel's documents hint at, and their naming scheme.
Could you point out these hints to me? Thanks.
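
podspi: nothing to do with threading. A gather is a vector load from non-contiguous memory addresses, and a scatter is the matching store. In scalar C the semantics are simply this (an illustrative sketch, not any shipping Intel instruction):

/* gather: eight indexed reads; a hardware gather would issue all
 * eight loads as a single vector instruction */
void gather8(float *dst, const float *table, const int *idx) {
    for (int i = 0; i < 8; i++)
        dst[i] = table[idx[i]];
}

/* scatter: eight indexed writes, likewise one instruction in hardware */
void scatter8(float *table, const float *src, const int *idx) {
    for (int i = 0; i < 8; i++)
        table[idx[i]] = src[i];
}

Without hardware gather/scatter, a vectorizing compiler hits a wall whenever array indices aren't consecutive, which is exactly why they matter so much for auto-vectorization.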