> So while we all know 10 nm really shook things up and is actively making the situation worse for them, it's not the only reason they're struggling. Had they increased core counts when they made Broadwell instead of expanding the GPU, or created a desktop variant of the server cores, it would have barely cost Intel anything and they wouldn't have been so easy for AMD to catch up with. But no, it had to be a laptop die the whole time.

The DIY market really isn't that big. The IGP is pretty necessary for OEMs. There's a reason Intel is still making a ton of money in desktops even now.
> They simply didn't have a desktop Broadwell, which they later said they regretted. The 5775C is a scaled-up HQ chip, which explains why it clocks so poorly.

I think it was a release timing thing. Like, he's saying Broadwell should have had a full desktop release in 2014, even though 14 nm yields obviously weren't great back then.
> And the 4770K was mediocre.

Of course it wasn't, for one simple reason: it brought AVX2 to the masses, an instruction set that is widely used today, even in games.
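To make the AVX2 point concrete, here's a minimal C sketch of my own (not from anyone in the thread): a single 256-bit instruction adds eight 32-bit integers at once, which is the kind of data-parallel work games, codecs, and media software lean on.

```c
/* Illustrative sketch, not from the thread: add eight 32-bit integers
 * per instruction with AVX2 intrinsics. Build: gcc -O2 -mavx2 avx2_add.c */
#include <immintrin.h>
#include <stdio.h>

int main(void) {
    int a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    int b[8] = {10, 20, 30, 40, 50, 60, 70, 80};
    int r[8];

    __m256i va = _mm256_loadu_si256((const __m256i *)a);
    __m256i vb = _mm256_loadu_si256((const __m256i *)b);
    __m256i vr = _mm256_add_epi32(va, vb);      /* 8 adds in one instruction */
    _mm256_storeu_si256((__m256i *)r, vr);

    for (int i = 0; i < 8; i++)
        printf("%d ", r[i]);                    /* 11 22 33 44 55 66 77 88 */
    printf("\n");
    return 0;
}
```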
Not only that. AVX2 and FMA are the best-known additions, but Haswell also had Intel's attempt at transactional memory, TSX. It has since fallen out of favour because of bugs and a lack of real performance gains, but at least Intel tried to do something innovative at a time when they were dominating AMD.
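For anyone who never touched TSX, here's a rough sketch of my own (assuming a CPU and toolchain that still expose RTM; many later parts have it disabled in microcode): you attempt a hardware transaction and fall back to an ordinary atomic if it aborts.

```c
/* Illustrative sketch, not from the thread: a shared counter bumped inside an
 * RTM transaction, with an atomic fallback if the transaction aborts.
 * Build: gcc -O2 -mrtm tsx_counter.c  (real code should check CPUID for RTM
 * first, since executing XBEGIN on a CPU without TSX faults). */
#include <immintrin.h>
#include <stdio.h>

static long counter = 0;

static void increment(void) {
    unsigned status = _xbegin();
    if (status == _XBEGIN_STARTED) {
        counter++;                       /* runs inside the hardware transaction */
        _xend();                         /* commit */
    } else {
        /* aborted (conflict, capacity, interrupt, ...): take the slow path */
        __atomic_fetch_add(&counter, 1, __ATOMIC_SEQ_CST);
    }
}

int main(void) {
    /* single-threaded driver just to exercise both paths */
    for (int i = 0; i < 1000; i++)
        increment();
    printf("%ld\n", counter);
    return 0;
}
```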
I have always challenged, and will always challenge, the idea that the iGPU is important for OEMs, at least as far as business computers go. The extra $20-$30 for a dGPU means little to a business that is already overpaying for a business-class machine in the first place. All the desktops, hell, all the laptops my business orders, apart from our most mobile ones, have a dGPU.
I mean, there's a reason OEMs have largely avoided using Ryzen CPUs, with the exception of discrete, gaming-focused computers.
Yep but I'm not sure the iGPU is the biggest reason.
> Yes, but they knew in advance what the end result would be. They are not stupid; they can simulate all this stuff.

EXACTLY! They missed a HUGE opportunity to undercut AMD by not having patience. Now they've spent millions of dollars to produce a disappointing 11th-gen lineup that hurt their reputation. If anything, they should have just made some minor tweaks to the 10th-gen parts and rolled with them until the 7 nm products are ready in just a few more months.
What they did seems completely irrational to me. What would be rational, in my opinion, is to keep selling 10th-gen products and price them low enough that they make sense in performance per dollar against AMD's products.
7 nm will not be available until well into the 2022-23 time frame. Alder Lake, on 10 nm, should be available toward the end of the year. I agree, though, that they probably should have skipped Rocket Lake on the desktop.
> Seeing the preliminary results in the Anandtech 11700K review, the performance of this CPU compared to the previous one seems overall only marginally better, and in some cases even worse than the previous generation.
>
> What is the economic sense, and the impact on the company, of the decision to backport this new CPU to the 14 nm process? How much does the backport actually cost? Was that cost really worth it, when the only result is a fill-in product for a few months before Alder Lake arrives?
>
> If it does not make sense financially, then why did they do it? They must have known full well what the end result would be.

Because getting better takes practice, and engineers learning by doing is real.