Discussion Intel - the cost of BACKPORTING


Kocicak

Senior member
Jan 17, 2019
Seeing the preliminary results in the AnandTech 11700K review, the performance of this CPU overall seems only marginally better than the previous generation, and in some cases even worse.

What is the economic sense, and what is the impact on the company, of the decision to backport this new CPU to 14 nm technology? How much money did this backport actually cost? Was that cost really worth it, when the only result is a "fill-in product" for a few months before Alder Lake arrives?

If it does not make sense financially, then why did they do it? They must have known well what the end result would be.
 

Topweasel

Diamond Member
Oct 19, 2000
Good point. I knew Ivy Bridge was an improvement in power usage, but I forgot how disappointing it was for desktops. I've always said it was the chip Microsoft originally wanted for the first Surface but gave up waiting for.
 

jpiniero

Lifer
Oct 1, 2010
So while we all know 10nm really shook things up and is actively making the situation worse for them, it's not the only reason they are struggling. Had they increased the core count when they made Broadwell, instead of expanding GPU cores, or created a desktop variation of the server cores, it would have barely cost Intel anything and they wouldn't have been so easy for AMD to catch up with. But no, it had to be a laptop die the whole time.

The DIY market really isn't that big. The IGP is pretty necessary for OEMs. There's a reason that Intel is still making a ton of money in desktops even now.
 

IntelUser2000

Elite Member
Oct 14, 2003
They simply didn't have a desktop Broadwell, a decision they said they regretted. The 5775C is a scaled-up HQ chip, which explains why it clocks poorly.

Sunny Cove is a nice jump performance-wise at the same clocks, but it is merely an expansion. It just happened to be a bigger expansion than before. Sandy Bridge is the one that brought new ideas.

In contrast, Zen is a radical departure (of course), Zen 2 brings a new type of branch predictor, and Zen 3 is an expansion but not merely that, and does it smartly. I hope Golden Cove is more than just an expansion and does some things better and smarter.

We can see based on Rocket Lake that the Sunny Cove core is less efficient, which is why it uses so much power and has to have two fewer cores. It works on 10nm because that is a smaller, more power-efficient process.
 

jpiniero

Lifer
Oct 1, 2010
They simply didn't have a desktop Broadwell, a decision they said they regretted. The 5775C is a scaled-up HQ chip, which explains why it clocks poorly.

I think it was a release timing thing. Like he's saying Broadwell should have had a full desktop release in 2014 despite the 14 nm yields obviously not being great then.
 

JoeRambo

Golden Member
Jun 13, 2013
Of course it wasn't, for one simple fact: it brought AVX2 to the masses, an instruction set that is widely used today, even in games.

Not only that. AVX2 + FMA are the best-known additions, but Haswell also had Intel's attempt at transactional memory, TSX. It has since fallen from favour due to bugs and a lack of performance gains, but at least Intel tried to do something innovative at a time when they were dominating AMD.
In my experience, Haswell really required recompilation. For example, at my company we had some specialized code that could use BMI/BMI2 instructions when compiled with ICC for a Haswell target (on server hardware). Going from Westmere to Haswell gained a lot of performance, but recompiling for Haswell was a wow moment: we picked up another 30% of performance just by adding the target arch.
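To illustrate what "adding the target arch" means in practice, here is a minimal, hypothetical sketch (not the poster's actual code), assuming GCC/ICC-style target flags: the same C source, rebuilt for a Haswell target, lets the compiler use AVX2 and BMI/BMI2 instead of a Westmere-era SSE baseline.

/* Hypothetical hot loop, not the specialized code from the post above.
 * Built for the old baseline:  gcc -O2 -march=westmere  (or icc -xSSE4.2)
 * Rebuilt for the new target:  gcc -O2 -march=haswell   (or icc -xCORE-AVX2)
 * Only the target flag changes; the compiler is then free to emit AVX2
 * vector code and BMI/BMI2 bit-manipulation instructions where they help. */
#include <stddef.h>
#include <stdint.h>

/* Count how many 64-bit words have at least one bit of flag_mask set. */
uint64_t count_flagged(const uint64_t *words, size_t n, uint64_t flag_mask)
{
    uint64_t hits = 0;
    for (size_t i = 0; i < n; i++)
        hits += (words[i] & flag_mask) != 0;  /* vectorizable with AVX2 */
    return hits;
}

The flag names are the real GCC/ICC ones; the function itself and the size of any speedup are made up for illustration.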
 

Topweasel

Diamond Member
Oct 19, 2000
The DIY market really isn't that big. The IGP is pretty necessary for OEMs. There's a reason that Intel is still making a ton of money in desktops even now.
I have and will always challenge the idea that the iGPU is important for OEMs, at least as far as business computers go. The extra $20-$30 for a dGPU means little to a business already overpaying for a business computer in the first place. All desktops, hell, all laptops my business orders outside of our most mobile ones have a dGPU.

And at that point they don't need Iris or really anything. People wanted AMD to throw a Vega 3 or something onto desktop Ryzen for similar reasons (due to pin requirements, not an option until AM5). So fine, throw an iGPU in all of them. But when push came to shove, when they absolutely had to increase die size, when AMD was almost dead, they could have increased both the GPU cores and the CPU cores and still come in within their "budget". Instead they went all in on the GPU.

Consumer desktops were dropped like a rock by Intel. Rocket Lake is more a challenge to Renoir and Picasso than it is to Vermeer. Its only saving grace is how high they clocked it past its efficiency point. Funny, considering it won't see laptops, at least not any time soon, leaving Intel with a max of 4 cores until the 8-core Tiger Lake, which I am assuming was an emergency development (much like Rocket Lake) and, considering their aversion to large 10nm chips, the last thing they wanted to do.
 

jpiniero

Lifer
Oct 1, 2010
I have and will always challenge the idea that the iGPU is important for OEMs, at least as far as business computers go. The extra $20-$30 for a dGPU means little to a business already overpaying for a business computer in the first place. All desktops, hell, all laptops my business orders outside of our most mobile ones have a dGPU.

I mean, there's a reason OEMs have largely avoided using Ryzen CPUs, with the exception of gaming-focused computers with discrete graphics.
 

BadThad

Lifer
Feb 22, 2000
Yes, but they knew in advance what the end result would be; they are not stupid, they can simulate all this stuff.

What they did seems completely irrational to me. What would be rational, in my opinion, is to keep selling 10th gen products and price them so low that they make sense in performance per dollar compared to AMD's products.

EXACTLY! They missed a HUGE opportunity to undercut AMD by not having patience. Now they've spent millions of dollars to produce a disappointing 11 series that hurt their reputation. If anything, they should have just made some minor tweaks to the 10 series and rolled with it until the 7nm products are ready in just a few more months.
 

ondma

Platinum Member
Mar 18, 2018
EXACTLY! They missed a HUGE opportunity to undercut AMD by not having patience. Now they've spent millions of dollars to produce a disappointing 11 series that hurt their reputation. If anything, they should have just made some minor tweaks to the 10 series and rolled with it until the 7nm products are ready in just a few more months.
7nm will not be available until well into the 2022-23 time frame. Alder Lake, on 10nm, should be available toward the end of the year. I agree, though, that they probably should have skipped Rocket Lake on the desktop.
 

IntelUser2000

Elite Member
Oct 14, 2003
Now we know the true cost of backporting.

Summary: Don't do it.

Tiger Lake-H base frequency:
@65W: 3.3GHz
@45W: 2.6GHz

11900K:
@125W: 3.5GHz
@95W: 3GHz

11900T:
@35W: 1.5GHz
@25W: 1GHz

11900:
@65W: 2.5GHz

So you get roughly a 30% frequency gain at the same power (3.3GHz vs 2.5GHz at 65W), or nearly half the power at the same frequency. At lower power levels the gain in clocks is 40%: the Tiger Lake 11900H has a base frequency of 2.1GHz, against 1.5GHz for the 35W 11900T.
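A throwaway arithmetic check of those two ratios, using only the base clocks quoted above (my own sketch, not from the post):

#include <stdio.h>

int main(void)
{
    /* Base clocks quoted above, in GHz */
    double tgl_65w = 3.3;  /* Tiger Lake-H @ 65W          */
    double rkl_65w = 2.5;  /* 11900 (Rocket Lake) @ 65W   */
    double tgl_low = 2.1;  /* 11900H (Tiger Lake-H)       */
    double rkl_35w = 1.5;  /* 11900T (Rocket Lake) @ 35W  */

    /* 3.3 / 2.5 = 1.32 -> ~30% higher base clock at the same 65W */
    printf("Same power: +%.0f%%\n", (tgl_65w / rkl_65w - 1.0) * 100.0);
    /* 2.1 / 1.5 = 1.40 -> 40% higher base clock at low power */
    printf("Low power:  +%.0f%%\n", (tgl_low / rkl_35w - 1.0) * 100.0);
    return 0;
}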

We can see that 10nm SuperFin is the true full-node jump from 14nm, and I mean a full-node jump in the pre-FinFET sense.
 

Roland00Address

Platinum Member
Dec 17, 2008
Seeing the preliminary results in the AnandTech 11700K review, the performance of this CPU overall seems only marginally better than the previous generation, and in some cases even worse.

What is the economic sense, and what is the impact on the company, of the decision to backport this new CPU to 14 nm technology? How much money did this backport actually cost? Was that cost really worth it, when the only result is a "fill-in product" for a few months before Alder Lake arrives?

If it does not make sense financially, then why did they do it? They must have known well what the end result would be.
Because getting better takes practice, and engineers learning by doing is real.

Intel is going to be targeting several nodes in its own foundries and several nodes at TSMC, and who knows what foundry process comes after Intel 7nm. (Remember, Intel 7nm, if it works, sits between TSMC 5nm and 3nm in density.)

Likewise, the more expensive Intel CPUs will be using Intel's Foveros packaging tech. Foveros increases costs, but you can stack multiple dies together, and they can be built at different foundries, such as making the IO die an older generation even while the CPU and GPU are the latest generation of IP on the best foundries out there.

Think of the backport as R&D practice for their engineers: experience they will need, since they will now be doing this on 4 or even more foundries in the future, with 2 different gens at Intel's foundries and 2 different gens at TSMC.