
Intel Broadwell BDW-H delayed May 2015

I think you're confusing the past and now. Desktop was the majority market back then. Today it's the most insignificant segment there is in priority terms, and a lot of people seem to be in denial of this, blaming or excusing everything else. The same goes for dGPUs, for that matter.

It's still a big market (at least for Intel) in terms of $$s and profits, but it is probably in Intel's interest to just move enthusiast to the HEDT platform and make even bigger profits for themselves and their partners.

Intel also still relies on a process of developing the uArch on desktop x86 CPUs and building their server CPUs off that basic implementation. I wonder if that will change over the next 5-10 years.
 
If they launch Skylake in late Q3/Q4 2015 then it's a year after Broadwell-Y/U and close to 1.5 years after Haswell Refresh. LGA2011 chips are typically behind mainstream chips by one generation; perhaps the same will happen with unlocked K chips from now on. Nothing stops Intel from replacing non-K Haswell Refresh based SKUs with non-K Skylake desktop chips in H2/2015. Broadwell-K could very well be Devil's Canyon's direct successor if it's really launching in Q2/2015. If it runs at over 4GHz and locked Skylake tops out at 3.5-3.6GHz then those could coexist and Intel could launch Skylake-K a few months later in 2016. We'll have to wait and see.
 
So you mean that since Broadwell-H/K will be released in summer of 2015, it gives us desktop Skylake two years after that, i.e. in summer of 2017?

This thread is a rumor, not a fact. And I actually meant 2 years for 22nm, so Intel shouldn't have any economic concerns for the 14nm launch, which I think they don't because BK earlier said they'd release Broadwell as soon as they could, but yield issues can always happen. 1Y3Q still shouldn't be a problem for Intel, so I hope 10nm doesn't get delayed because of 14nm's delay.

BTW, I hope you meant Cannonlake instead of Skylake.
 
Can you imagine moving into a 10-year replacement cycle for Desktop PCs? Basically, only replace them when they fail, instead of when there is something new / faster to purchase.
 
Witeken- don't confuse per wafer costs with total costs. Intel also has to pay for those staggeringly expensive fabs. If it can build fewer 14nm fabs by milking the 22nm process, it saves a hell of a lot of capital.
 
Can you imagine moving into a 10-year replacement cycle for Desktop PCs? Basically, only replace them when they fail, instead of when there is something new / faster to purchase.

The big question would be where to buy them from. Because almost, if not all tech/foundry companies would go bankrupt with such a setup.
 
This thread is a rumor, not a fact. And I actually meant 2 years for 22nm, so Intel shouldn't have any economic concerns for the 14nm launch, which I think they don't because BK earlier said they'd release Broadwell as soon as they could, but yield issues can always happen. 1Y3Q still shouldn't be a problem for Intel, so I hope 10nm doesn't get delayed because of 14nm's delay.

BTW, I hope you meant Cannonlake instead of Skylake.

You miss my point. If they spend $X billion on R&D for a CPU generation, they cannot have a sales window only allowing $X/2 billion in profit or they'll make a loss.

And even if they'd make $2X billion in profit during that sales window, they'd be making higher profit per year if they'd increase the sales window so they could earn $4X billion on the $X billion invested R&D money. Now that only works when there's a lack of competition, which is what we have today within the desktop CPU arena.

There's of course still a limit to how far they can extend the sales window though, since at some point the customers will require further improvements to buy a new product.
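The argument above is just fixed-cost amortization: a one-time R&D outlay spread over a longer sales window raises the net profit earned per year, up to the point where customers stop buying. A minimal sketch, with entirely hypothetical dollar figures (the $2B numbers below are made up for illustration, not Intel's actual costs):

```python
def profit_per_year(rd_cost, annual_gross_profit, window_years):
    """Net profit per year for one CPU generation.

    Gross profit accrues each year of the sales window;
    R&D is a one-time fixed cost amortized over the window.
    All figures in billions of dollars (hypothetical).
    """
    total_net = annual_gross_profit * window_years - rd_cost
    return total_net / window_years

# Hypothetical: $2B R&D, $2B gross profit per year.
# A 1-year window only breaks even; stretching the same
# generation to 4 years nets $1.5B per year.
print(profit_per_year(2, 2, 1))  # 0.0
print(profit_per_year(2, 2, 4))  # 1.5
```

As the post notes, this only works while competition doesn't force a faster cadence, and only until customers demand something new.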
 
It's still a big market (at least for Intel) in terms of $$s and profits, but it is probably in Intel's interest to just move enthusiast to the HEDT platform and make even bigger profits for themselves and their partners.

Intel also still relies on a process of developing the uArch on desktop x86 CPUs and building their server CPUs off that basic implementation. I wonder if that will change over the next 5-10 years.

This is the scariest thing that's almost certain to happen. I don't want to have to spend $600 minimum just on a CPU and motherboard for gaming...
 
Can you imagine moving into a 10-year replacement cycle for Desktop PCs? Basically, only replace them when they fail, instead of when there is something new / faster to purchase.

That's pretty much what consumers are doing these days. Keep their PC until it breaks while buying tablets and smartphones. (Most?) Corporate users are still sticking to the 3-year cycle; if they went to 5-7 years Intel would have serious problems.
 
This is the scariest thing that's almost certain to happen. I don't want to have to spend $600 minimum just on a CPU and motherboard for gaming...

Did we have price inflation lately? X79 board for $190 and CPU for $325.

But again, the FUD in this thread is amazing. All desktop is gone except HEDT, right?
 
Yet another post insinuating that delays in desktop skus are due to lack of competition instead of lack of market demand. 🙄

Companies exist for profit. If consumers want to buy less of a product then expect less of that product to be made.

That sheet shows clear priority. Go after tablets and thin laptops with broadwell first. Desktop resources move to the -Y -U areas to make that happen sooner which pushes out the desktop versions. Not sure what this has to do with AMD who is exiting even faster from the mid-range desktop than even Intel is.
 
Boy, Devil's Canyon had better be good; otherwise I...

Wait, what will I do? Wait two years to see if AMD's new architecture is any good? Intel's really the only good high-end processor around. :\
 
Boy, Devil's Canyon had better be good; otherwise I...

Wait, what will I do? Wait two years to see if AMD's new architecture is any good? Intel's really the only good high-end processor around. :\

If you're sitting on a 2500K or earlier, it's still interesting to know what will become available in the next 2+ years, if you intend to upgrade in that time frame. And yes, AMD could have something worth considering by then, we'll see...
 
It annoys me that people usually link to wccftech. They are so dirty, it's a stolen vr-zone slide without link to the source. Credits should go to vr-zone.
 
Boy, Devil's Canyon had better be good; otherwise I...

Wait, what will I do? Wait two years to see if AMD's new architecture is any good? Intel's really the only good high-end processor around. :\

Is it really a bad thing if you don't have to upgrade to play the latest titles? Even like an overclocked 920 should be good for the foreseeable future.
 
It annoys me that people usually link to wccftech. They are so dirty, it's a stolen vr-zone slide without link to the source. Credits should go to vr-zone.

They are usually a preferred alternative to Google translate of the chinese VR-Zone, I guess that's why. But yeah, I agree with you in principle. And I think the link to the original source should always be added to the post.
 
@Techhog - I am not sure I understand you. Gaming consoles cost $400 today just to eke out 40fps @ 1080p on specific games. How do you expect your desktop to be future-proof for the next 5 years if you don't want to spend $800?
 
@Techhog - I am not sure I understand you. Gaming consoles cost $400 today just to eke out 40fps @ 1080p on specific games. How do you expect your desktop to be future-proof for the next 5 years if you don't want to spend $800?

I don't believe in "future proofing"
 
Is it really a bad thing if you don't have to upgrade to play the latest titles? Even like an overclocked 920 should be good for the foreseeable future.
Honestly, CPUs are becoming even less necessary to upgrade for gaming, with all the driver overhead reductions coming up through various APIs.
 
You miss my point. If they spend $X billion on R&D for a CPU generation, they cannot have a sales window only allowing $X/2 billion in profit or they'll make a loss.

And even if they'd make $2X billion in profit during that sales window, they'd be making higher profit per year if they'd increase the sales window so they could earn $4X billion on the $X billion invested R&D money. Now that only works when there's a lack of competition, which is what we have today within the desktop CPU arena.

There's of course still a limit to how far they can extend the sales window though, since at some point the customers will require further improvements to buy a new product.

That's economic nonsense. As soon as 14nm yields are high enough, Intel's priority will be to switch to the new process to take advantage of the increased margins per unit. $X (22nm) is already sunk and $Y (14nm) mostly spent, so the only way to make more short-term profit on 22nm is to halt development on 14nm. That's not going to happen.
 
I don't believe in "future proofing"

But there's just no need to upgrade. I still consider my 2600K pretty future-proof and don't intend to upgrade anytime soon; games don't require it. And the consoles' tablet CPUs certainly aren't going to push the envelope either. If DX12 can deliver the same improvements as Mantle has shown, CPUs will become even more irrelevant when it comes to gaming.
I don't see how any current i7 (K-model) wouldn't be "future-proof" for at least 5 years. At that point you'd probably want to upgrade because the platform would be outdated rather than the performance.
 
Yes, that's interesting! And what's even more interesting is that if you look at the release schedule, all the low power & low frequency models are released first. Then the other models are released at later times, in order of increasing power & frequency.

And the time difference between the first released product (Y-model) and the last model (H/K-models) is ~9 months (!).

You could argue that this is because mobile gets priority nowadays, if production capacity is limited. But this is an unusually large time difference between release of models, something we're not used to seeing. And all the top end mobile/laptop chips will be released last, which is also kind of strange, if mobile/laptop should be prioritized.

So I'm instead suspecting that this could mean that Intel are having problems reaching high frequencies at 14 nm, and are hoping to perfect the process to allow for higher frequencies as time passes. :hmm:
 