Intel Broadwell-K & Skylake (non-K) desktop CPUs to launch in Q2-2015

I don't either, because wasn't Intel supposed to be competing with themselves? At least according to some. 😉

Of course they are competing with themselves. I have a small business that relies on 2 laptops and 5 desktop rigs.

All Intel, except one AMD that I bought as a testbed and determined the price/performance wasn't there.

And I won't be replacing any of the Intel-based desktops or laptops until Intel produces a compelling performance/dollar upgrade narrative for my business.

That is Intel's competition: the chips they sold 2+ years ago won't need to be replaced until the chips they are currently selling make a compelling value-add argument for the upgrade segment of the industry (which is by far the largest portion of the TAM).

That said, it still doesn't explain why Intel would invest so heavily in developing FIVR only to pull it a couple years later. They only kill features for cost reduction or lack of revenue enablement.

So someone must have re-run some numbers and concluded that FIVR is not as critically enabling as once projected, so now it is falling into the "cost reduction opportunity" bin.

None of that is really surprising; it's standard business management stuff. I'm more curious to understand "what changed", because it is probably an interesting story. I'll have to ask around.
 
Intel eliminates features to reduce costs anytime those features are no longer internally forecast as being "revenue growth enablers".

Why do our Intel chips have IGP? It is currently forecast to continue to be a "revenue growth enabler". Should that internal forecast turn south then the feature will be cut.
Yep. People here complain about the IGPs, and for good reason, but Intel's got a good reason too. They've got a steadily increasing graphics market share, as OEMs are finding less reason to include discrete GPUs on notebooks and probably desktops too.
The question here is why was Intel convinced in the first place that FIVR was going to enable higher revenue growth, and what happened to change that expectation?

I personally have no insight into why Intel would be getting rid of FIVR, I thought the idea was solid and would naturally improve over time (like the IMC).
Exactly. I was fairly certain that FIVR was a huge part of why Haswell's battery life is so good. I knew it wasn't perfect, but I'd expected that it'd only get improved as Intel figured out how to integrate more of the power circuitry on die.

I'd heard it wasn't living up to Intel's expectations, previous to this, but I don't remember from where.

Really, I'm kind of stunned. Surely OEMs wanted it -- board thickness and area could be reduced, costs were lower, and I think the battery life savings were obvious.

The only real downside was the effect it had on Haswell overclocks. With it sitting smack-dab between every CPU core, between the CPU and GPU, and between the CPU and the System Agent, there's no doubt it got nice and toasty and negatively affected the adjacent cores.

But the die real estate costs were minimal for the improvement it (supposedly) gave.
I don't either, because wasn't Intel supposed to be competing with themselves? At least according to some. 😉
I'm not seeing the relevance of this.
 
The only real downside was the effect it had on Haswell overclocks. With it sitting smack-dab between every CPU core, between the CPU and GPU, and between the CPU and the System Agent, there's no doubt it got nice and toasty and negatively affected the adjacent cores.

I'm thinking that clocks were an issue going into 14nm - getting the heat output down near critical circuits may have improved yields at a given clock. Since many people look at specs and would likely respond more favorably to higher clocks, it's not enabling business in desktop/AIO, since consumers don't care much about power in those areas (they will in notebooks). Just going to 14nm will bring down power consumption and will give Skylake CPUs a huge advancement in battery life over 22nm in notebooks/netbooks.

Anyway, this is a wild guess based on stuff I've read, like what you have posted. Like IDC said, the real story will likely be interesting, and the decision was possibly made for some completely unexpected reason.
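
To make the heat/clock intuition above concrete, here's a back-of-the-envelope sketch using the classic dynamic-power relation P ≈ C·V²·f. Every number in it is invented purely for illustration; these are not Intel's actual process or SKU figures.

```python
# Back-of-the-envelope dynamic power scaling, P ~ C * V^2 * f.
# All numbers are made up for illustration; they are NOT Intel's
# actual process or SKU figures.

def dynamic_power(cap_rel, voltage, freq_ghz):
    """Relative dynamic power: switched capacitance (relative) * V^2 * f."""
    return cap_rel * voltage ** 2 * freq_ghz

# Hypothetical 22nm part: baseline capacitance, 1.10 V, 3.5 GHz.
p22 = dynamic_power(1.00, 1.10, 3.5)

# Hypothetical 14nm shrink: ~30% less switched capacitance and a small
# voltage drop buy power headroom at the same clock...
p14_same = dynamic_power(0.70, 1.05, 3.5)

# ...or the same power budget can be spent on a higher clock instead,
# which is what spec-sheet shoppers actually notice.
p14_fast = dynamic_power(0.70, 1.05, 4.2)

for label, p in [("22nm @ 3.5 GHz", p22),
                 ("14nm @ 3.5 GHz", p14_same),
                 ("14nm @ 4.2 GHz", p14_fast)]:
    print(f"{label}: {p / p22:.2f}x relative dynamic power")
```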
 
Couple more images:
[Image: Richtek-motherboard-solutions-04.JPG]

Ooh, are those TDPs?
[Image: Richtek-motherboard-solutions-05.JPG]


Looks like Skylake goes up to 95W? Seems to have core configs too!

Desktop 4C+GT2 = 95W, there's a 4C+GT4e part too 🙂
 
95W!? These things better ship at 4.5GHz stock. That is ridiculous!
 
More than that actually:
-3mm reduction in thickness
-VR section on the board reduced to ~1/3.5th of its size
-30% increase in iGPU performance
-Better response with sleep states

Could mean that Skylake is going to be targeted towards high-power laptops and high-performance desktops/servers. Is this the true successor to Sandy Bridge?
 
Maybe the cost/benefit rationale wasn't compelling enough to continue. Those numbers are all very solid, so I'm not sure why they would backtrack unless they came up with some other way of regulating voltage. With 10nm, FIVR could be less of a necessity, but I'm still perplexed by the decision to axe this implementation, especially considering the iGPU perf.
 
IDC,

Here is a post from InvestorsHub from somebody with knowledge of the Haswell project to give you some insight.

There are some kernels of truth to this IVR scare, but basing investment advice on such incomplete information is downright criminal. So here's a little inside info about this integrated VR stuff. I'm not involved with IVR technology, but I do have some insight into the Haswell project.

First of all, the primary benefit of an integrated VR is cost savings. It's integrating yet more things that used to be on a motherboard into the chip itself, and that means a lower BOM for OEMs. Not that there aren't other benefits as well, but those other benefits by themselves probably wouldn't be enough to take on the technical challenges (and they are numerous).

Secondly, there were indeed some problems identified with Haswell's integrated VRs when silicon came back over a year ago. You know who found those problems? The team in Israel. I've been told that the folks on that team do not like some aspects of the IVRs on HSW (I don't know if it's the implementation, or the whole concept, or whether they doubt the ROI, or what). One of the benefits of having two world class design teams at your disposal is that they are always pushing each other, always questioning each other, not to mention the friendly competition that always goes on.

Well, the folks in IDC wrote a funky power virus test that showed some holes in the IVR circuit performance on Haswell silicon. This did indeed cause an all-hands-on-deck response, and various circuit improvements that were already done on the Broadwell IVRs were pulled back into the Haswell design, along with some other potential fixes that were identified. I won't go into details on the problem, the impact, or the solution, but continuous improvements were put into each successive stepping, and my third-hand knowledge is that the problem is 99% contained. Meanwhile, Haswell's PRQ schedule marches on unabated.

It's worth noting that in the grand scheme of processor design, there really wasn't anything new here. This was by no means the first all-hands-on-deck emergency on the road to releasing a brand new processor, and it won't be the last. And it's also not the first time that a major problem was 99% solved instead of 100%.

I think the reason this one got into the public domain is because it's likely that the next tock project that will be designed by IDC, Skylake, will probably not have IVRs (I don't actually know that, but it seems like a safe bet to me if they don't believe in the technology for whatever reason). Since this is a feature that directly impacts board and system manufacturers, I'm sure that many folks outside Intel asked "Why?". And if the person answering that question has some bias against the technology, there could be some bad-mouthing or doom-saying that slipped into the answer (or maybe that needs to be part of the answer to cushion the blow of removing a feature that is helpful to system makers).

If you worry that maybe things like cost or power could get out of control with such a decision, remember that those two vectors are still priority #1 with Intel's processors, and you can bet they'll be pulling out all the stops to put out a killer product. Those folks over in IDC are no dummies, and if their ROI analysis tells them a project doesn't need IVRs to be successful, I trust them. Again, I have no idea what is or isn't being included in Skylake and other future products, this is just my logical conjecture.

So that's what I know, along with my conjecture. It really is amazing how we've all become so jaded to the enormous complexities of building better and better microprocessors, as if new designs and products come out every year all on their own, with no hefty engineering challenges cropping up along the way. Most on this board know better, and know that it doesn't happen that way at all. The sausage-making can get pretty ugly sometimes, and nothing is ever easy. Intel is where it's at today because it responds to those challenges better than anybody, IMO. Have a little faith, folks...

http://investorshub.advfn.com/boards/read_msg.aspx?message_id=86312251
 

Interesting commentary, thanks!

I liked this part, as it is a good sign:
One of the benefits of having two world class design teams at your disposal is that they are always pushing each other, always questioning each other, not to mention the friendly competition that always goes on.

When I worked on embedded firmware for enterprise network switches, we had at least a half-dozen other teams doing similar work on other products, and we often used code developed by another group. We'd snipe at each other when something was poorly coded or used too much memory, etc. We really kept each other sharp!

It's really good to hear that this kind of dynamic is occurring at Intel - it shows that internal competition will help Intel advance the best technologies and practices and roll back the ones that ultimately cause more problems than they solve.
 
This still does not answer IDC's question: "I'm more curious to understand 'what changed', because it is probably an interesting story."
I would like to know as well
 
Yep. People here complain about the IGPs, and for good reason, but Intel's got a good reason too. They've got a steadily increasing graphics market share, as OEMs are finding less reason to include discrete GPUs on notebooks and probably desktops too.
Exactly.

Intel increases its GPU market share only because it is increasing its CPU market share over AMD, not because fewer dGPUs are sold.
Even before the APUs and Intel's on-die integrated GPUs, Intel had ~55% of the total GPU market share simply because they had integrated GPUs in their chipsets, both in desktops and laptops.

You can clearly see from the graph below that Intel gained higher GPU market share at the time when laptops started getting higher shipments than desktops, around 2007-2008. That is way before Intel launched Sandy Bridge in 2011.

http://www.techpowerup.com/forums/t...ference-transcript.180345/page-2#post-2849901

[Image: Overall graphic market share 2002-2012]



No APUs or Intel on-die integrated GPUs, and yet iGPU market share was higher than discrete even back in 2002-2003.
[Image: 18091-JPR-backpages-2.jpg]



Even in desktops, iGPUs on chipsets had higher market share than discrete GPUs before APUs were made.
[Image: GPU_mkt.png]


Desktop and laptop discrete GPU shipments didn't decrease. It is the integrated GPUs (on-die or embedded/APUs) of both Intel and AMD that increased, and that is what makes Intel's total GPU market share increase.
Simply put, 99% of Intel's non-server SKUs have had iGPUs for the last 2-3 years, from Atom/Celerons to high-end Core i7 (not Socket 2011).

[Image: integrated-graphics-report-chart.jpg]


Edit: Just to add, Intel will ship 30-40M Atoms for tablets in 2014. Those Atoms have iGPUs, and those iGPUs will be added to the overall GPU market share.
Outstanding, Intel will increase its GPU market share in 2014. :whistle:
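
To put rough numbers on that attach-rate argument (every unit count below is invented, just to show the shape of the math): counting each iGPU shipped means Intel's total GPU share rises with its CPU and tablet Atom shipments even when discrete GPU sales stay perfectly flat.

```python
# Toy illustration: total GPU "market share" counts every iGPU shipped,
# so attaching an iGPU to nearly every CPU lifts Intel's share even if
# discrete GPU shipments stay flat. All unit numbers are invented.

intel_cpus       = 80_000_000   # hypothetical quarterly CPU units
igpu_attach_rate = 0.99         # "99% of non-server SKUs have iGPUs"
amd_apus         = 15_000_000   # hypothetical
discrete_gpus    = 20_000_000   # hypothetical, flat quarter over quarter

intel_gpus = intel_cpus * igpu_attach_rate
total_gpus = intel_gpus + amd_apus + discrete_gpus

print(f"Intel GPU share: {intel_gpus / total_gpus:.1%}")
# Every extra CPU or tablet Atom Intel ships raises this number
# without a single discrete GPU sale changing.
```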
 
@AtenRa - so what you are telling us is that the dGPU market has stagnated since 2005, but the iGPU market has grown really well over the last 7 years. Not a bad strategy by Intel, I would guess.
 
Intel increases its GPU market share only because it is increasing its CPU market share over AMD, not because fewer dGPUs are sold.
Even before the APUs and Intel's on-die integrated GPUs, Intel had ~55% of the total GPU market share simply because they had integrated GPUs in their chipsets, both in desktops and laptops.

You can clearly see from the graph below that Intel gained higher GPU market share at the time when laptops started getting higher shipments than desktops, around 2007-2008. That is way before Intel launched Sandy Bridge in 2011.
I don't see how this conflicts in any way with what I've said.
 
I don't see how this conflicts in any way with what I've said.

I thought you implied that Intel is increasing its GPU market share because AMD and NVIDIA are selling fewer discrete GPUs. You said that OEMs are finding it harder to justify dGPUs on laptops and desktops, no?
 
Basically no desktop CPUs for Broadwell except the K series. IIRC, isn't that what was speculated late last year - BGA-only mobile CPUs and the K series? I guess if this was true, it will continue to happen and more desktop CPUs will be BGA.
 
@AtenRa - so what you are telling us is that the dGPU market has stagnated since 2005, but the iGPU market has grown really well over the last 7 years. Not a bad strategy by Intel, I would guess.

The real question, perhaps the only one that matters, is what is the revenue share?

Gaining unit share is nice if there is revenue tied to those units. However, if you are gaining unit market share because you are giving away iGPUs essentially for free, then it is hardly a cause for joy amongst shareholders or employees alike.
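
A toy example (numbers made up, purely illustrative) shows how far unit share and revenue share can diverge when one vendor's GPUs ship essentially for free inside another product:

```python
# Unit share vs. revenue share with invented numbers: a vendor can
# dominate unit share while capturing almost no GPU revenue if its
# GPUs are bundled "for free" inside another product.

vendors = {
    # name: (units shipped, effective GPU ASP in dollars) -- hypothetical
    "bundled iGPU vendor": (80_000_000, 2.0),    # near-zero standalone price
    "discrete GPU vendor": (20_000_000, 150.0),
}

total_units   = sum(units for units, _ in vendors.values())
total_revenue = sum(units * asp for units, asp in vendors.values())

for name, (units, asp) in vendors.items():
    print(f"{name}: {units / total_units:.0%} unit share, "
          f"{units * asp / total_revenue:.0%} revenue share")
```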
 
This must be an older roadmap; Broadwell-U has been pushed to Q1 2015.

Broadwell-U GT3 has been pushed to Q1 (according to one VR-Zone roadmap), GT2 could still be launched in late 2014 and I expect quite a few design wins @ CES. Also, this is the third leak in a month mentioning desktop Skylake in 2015, probably coming from different sources. I think the Skylake @ H2/2016 or 2017 crowd should just give up. 😉
 