Intel Broadwell BDW-H delayed May 2015


Ajay

Lifer
Jan 8, 2001
16,094
8,114
136
I think you confuse the past with the present. Desktop was the majority market back then. Today it's the most insignificant segment there is in priority terms, and a lot of people seem to be in denial of this, blaming or excusing everything else. The same goes for dGPUs, for that matter.

It's still a big market (at least for Intel) in terms of $$s and profits, but it is probably in Intel's interest to just move enthusiast to the HEDT platform and make even bigger profits for themselves and their partners.

Intel also still relies on a process of developing the uArch on desktop x86 CPUs and building their server CPUs off that basic implementation. I wonder if that will change over the next 5-10 years.
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,143
136
If they launch Skylake in late Q3/Q4 2015 then it's a year after Broadwell-Y/U and close to 1.5 years after Haswell Refresh. LGA2011 chips are typically behind mainstream chips by 1 generation; perhaps the same will happen with unlocked K chips from now on. Nothing stops Intel from replacing non-K Haswell Refresh based SKUs with non-K Skylake desktop chips in H2/2015. Broadwell-K could very well be Devil's Canyon's direct successor if it's really launching in Q2/2015. If it runs at over 4 GHz and locked Skylake tops out at 3.5-3.6 GHz then those could coexist, and Intel could launch Skylake-K a few months later in 2016. We'll have to wait and see.
 
Last edited:

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
So you mean that since Broadwell-H/K will be released in summer of 2015, it gives us desktop Skylake two years after that, i.e. in summer of 2017?

This thread is a rumor, not a fact. And I actually meant 2 years for 22nm, so Intel shouldn't have any economic concerns for the 14nm launch, which I think they don't, because BK earlier said they'd release Broadwell as soon as they could, but yield issues can always happen. 1Y3Q still shouldn't be a problem for Intel, so I hope 10nm doesn't get delayed because of 14nm's delay.

BTW, I hope you meant Cannonlake instead of Skylake.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Can you imagine moving into a 10-year replacement cycle for Desktop PCs? Basically, only replace them when they fail, instead of when there is something new / faster to purchase.
 

NTMBK

Lifer
Nov 14, 2011
10,461
5,845
136
Witeken- don't confuse per wafer costs with total costs. Intel also has to pay for those staggeringly expensive fabs. If it can build fewer 14nm fabs by milking the 22nm process, it saves a hell of a lot of capital.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Can you imagine moving into a 10-year replacement cycle for Desktop PCs? Basically, only replace them when they fail, instead of when there is something new / faster to purchase.

The big question would be where to buy them from, because almost all, if not all, tech/foundry companies would go bankrupt with such a setup.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,248
598
126
This thread is a rumor, not a fact. And I actually meant 2 years for 22nm, so Intel shouldn't have any economic concerns for the 14nm launch, which I think they don't, because BK earlier said they'd release Broadwell as soon as they could, but yield issues can always happen. 1Y3Q still shouldn't be a problem for Intel, so I hope 10nm doesn't get delayed because of 14nm's delay.

BTW, I hope you meant Cannonlake instead of Skylake.

You miss my point. If they spend $X billion on R&D for a CPU generation, they cannot have a sales window allowing only $X/2 billion in profit, or they'll make a loss.

And even if they'd make $2X billion in profit during that sales window, they'd be making a higher return on the R&D if they extended the sales window so they could earn $4X billion on the $X billion invested. That only works when there's a lack of competition, which is what we have today in the desktop CPU arena.

There's of course still a limit to how far they can extend the sales window, since at some point customers will require further improvements before buying a new product.
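The amortization argument above can be sketched numerically. This is just an illustrative model with made-up figures (the R&D cost, yearly gross profit, and window lengths are all assumptions, not Intel's actual numbers):

```python
# Sketch of the sales-window argument: R&D is a fixed cost per CPU
# generation, so a longer sales window amortizes it over more revenue.
# All figures are illustrative assumptions, in $ billions.

def profit_per_year(rnd_cost, gross_profit_per_year, window_years):
    """Average yearly net profit for one CPU generation."""
    total_net = gross_profit_per_year * window_years - rnd_cost
    return total_net / window_years

RND = 2.0   # assumed R&D spend for one generation
GP = 1.5    # assumed gross profit per year while the generation is on sale

for years in (1, 2, 4):
    print(f"{years}-year window: {profit_per_year(RND, GP, years):+.2f}/yr")
# 1-year window: -0.50/yr (loss: window too short to recoup R&D)
# 2-year window: +0.50/yr
# 4-year window: +1.00/yr
```

Under these assumptions the per-year return keeps rising with the window length, which is the incentive described above; and as the post notes, the model only holds while no competitor forces a refresh.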
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
It's still a big market (at least for Intel) in terms of $$s and profits, but it is probably in Intel's interest to just move enthusiast to the HEDT platform and make even bigger profits for themselves and their partners.

Intel also still relies on a process of developing the uArch on desktop x86 CPUs and building their server CPUs off that basic implementation. I wonder if that will change over the next 5-10 years.

This is the scariest thing that's almost certain to happen. I don't want to have to spend $600 minimum just on a CPU and motherboard for gaming...
 

jpiniero

Lifer
Oct 1, 2010
16,866
7,307
136
Can you imagine moving into a 10-year replacement cycle for Desktop PCs? Basically, only replace them when they fail, instead of when there is something new / faster to purchase.

That's pretty much what consumers are doing these days: keeping their PC until it breaks while buying tablets and smartphones. (Most?) Corporate users are still sticking to the 3-year cycle; if they went to 5-7 years Intel would have serious problems.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
This is the scariest thing that's almost certain to happen. I don't want to have to spend $600 minimum just on a CPU and motherboard for gaming...

Did we have price inflation lately? An X79 board for $190 and a CPU for $325.

But again, the FUD in this thread is amazing. All desktop is gone except HEDT, right?
 

kimmel

Senior member
Mar 28, 2013
248
0
41
Yet another post insinuating that delays in desktop SKUs are due to lack of competition instead of lack of market demand. :rolleyes:

Companies exist for profit. If consumers want to buy less of a product then expect less of that product to be made.

That sheet shows clear priority: go after tablets and thin laptops with Broadwell first. Desktop resources move to the -Y/-U areas to make that happen sooner, which pushes out the desktop versions. Not sure what this has to do with AMD, who is exiting the mid-range desktop even faster than Intel is.
 

Ken g6

Programming Moderator, Elite Member
Moderator
Dec 11, 1999
16,703
4,661
75
Boy, Devil's Canyon had better be good; otherwise I...

Wait, what will I do? Wait two years to see if AMD's new architecture is any good? Intel's really the only good high-end processor around. :\
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,248
598
126
Boy, Devil's Canyon had better be good; otherwise I...

Wait, what will I do? Wait two years to see if AMD's new architecture is any good? Intel's really the only good high-end processor around. :\

If you're sitting on a 2500K or earlier, it's still interesting to know what will become available in the next 2+ years if you intend to upgrade in that time frame. And yes, AMD could have something worth considering by then; we'll see...
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
Did we have a price inflation lately? X79 board for 190$ and CPU for 325$.

But again, the FUD in this thread is amazing. All desktop is gone cept HEDT right?

No need to get nasty over a simple mistake. $500 is still a crapton anyway.
 

mikk

Diamond Member
May 15, 2012
4,311
2,395
136
It annoys me that people usually link to wccftech. They are so dirty; it's a stolen VR-Zone slide without a link to the source. Credit should go to VR-Zone.
 

jpiniero

Lifer
Oct 1, 2010
16,866
7,307
136
Boy, Devil's Canyon had better be good; otherwise I...

Wait, what will I do? Wait two years to see if AMD's new architecture is any good? Intel's really the only good high-end processor around. :\

Is it really a bad thing if you don't have to upgrade to play the latest titles? Even an overclocked 920 should be good for the foreseeable future.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,248
598
126
It annoys me that people usually link to wccftech. They are so dirty; it's a stolen VR-Zone slide without a link to the source. Credit should go to VR-Zone.

They are usually a preferred alternative to Google Translate of the Chinese VR-Zone; I guess that's why. But yeah, I agree with you in principle, and I think a link to the original source should always be added to the post.
 

bullzz

Senior member
Jul 12, 2013
405
23
81
@Techhog - I am not sure I understand you. Gaming consoles cost $400 today just to eke out 40 fps @ 1080p on specific games. How do you expect your desktop to be future-proof for the next 5 years if you don't want to spend $800?
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
@Techhog - I am not sure I understand you. Gaming consoles cost $400 today just to eke out 40 fps @ 1080p on specific games. How do you expect your desktop to be future-proof for the next 5 years if you don't want to spend $800?

I don't believe in "future proofing"
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
Is it really a bad thing if you don't have to upgrade to play the latest titles? Even an overclocked 920 should be good for the foreseeable future.
Honestly, CPUs are becoming even less necessary to upgrade for gaming, with all the driver overhead reductions coming up through various APIs.
 

jj109

Senior member
Dec 17, 2013
391
59
91
You miss my point. If they spend $X billion on R&D for a CPU generation, they cannot have a sales window allowing only $X/2 billion in profit, or they'll make a loss.

And even if they'd make $2X billion in profit during that sales window, they'd be making a higher return on the R&D if they extended the sales window so they could earn $4X billion on the $X billion invested. That only works when there's a lack of competition, which is what we have today in the desktop CPU arena.

There's of course still a limit to how far they can extend the sales window, since at some point customers will require further improvements before buying a new product.

That's economic nonsense. As soon as 14nm yields are high enough, Intel's priority will be to switch to the new process to take advantage of the increased margins per unit. $X (22nm) is already sunk and $Y (14nm) mostly spent, so the only way to make more short-term profit on 22nm is to halt development on 14nm. That's not going to happen.
 

showb1z

Senior member
Dec 30, 2010
462
53
91
I don't believe in "future proofing"

But there's just no need to upgrade. I still consider my 2600K pretty future-proof and don't intend to upgrade anytime soon; games don't require it. And the consoles' tablet CPU certainly isn't going to push the envelope either. If DX12 can deliver the same improvements as Mantle has shown, CPUs will become even more irrelevant when it comes to gaming.
I don't see how any current i7 (K-model) wouldn't be "future-proof" for at least 5 years. At that point you'd probably want to upgrade because the platform would be outdated rather than the performance.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,248
598
126
Yes, that's interesting! And what's even more interesting is that if you look at the release schedule, all the low power & low frequency models are released first. Then the other models are released at later times, in order of increasing power & frequency.

And the time difference between the first released product (Y-model) and the last model (H/K-models) is ~9 months (!).

You could argue that this is because mobile gets priority nowadays, if production capacity is limited. But this is an unusually large gap between model releases, something we're not used to seeing. And all the top-end mobile/laptop chips will be released last, which is also kind of strange if mobile/laptop is supposed to be prioritized.

So I'm instead suspecting that this could mean that Intel are having problems reaching high frequencies at 14 nm, and are hoping to perfect the process to allow for higher frequencies as time passes. :hmm:
 
Last edited: