Discussion Intel - the cost of BACKPORTING


Kocicak

Senior member
Jan 17, 2019
982
973
136
Looking at the preliminary results in the AnandTech 11700K review, this CPU overall seems only marginally better than the previous CPU, and in some cases even worse than the previous-generation product.

What is the economic sense, and the impact on the company, of the decision to backport this new CPU to the 14 nm technology? How much money does this backport actually cost? Was this cost really worth it, when the only result is that you will have some "fill-in product" for a few months before Alder Lake comes?

If it does not make sense financially, then why did they do it? They must have known well what the end result of this would be.
 
  • Like
Reactions: krumme

dmens

Platinum Member
Mar 18, 2005
2,271
917
136
What can you do?!
Those are features of the Xe cards/igpu, features that are actually working, with drivers and everything, and with programs that are already using them.

Oh I dunno.. actually deliver one of those features on silicon.

You might want to check the actual compute throughput needed on a typical scientific/content workstation versus the capabilities of an Intel integrated GPU before you claim that weakling can handle "workstation" loads.
 

grant2

Golden Member
May 23, 2001
1,165
23
81
Why does the overall market share peak above all the shares in the segments? Wouldn't you expect at least one segment to be above the overall share?
Yes. The graph says every segment has < 20% market share, but somehow the "overall" is > 20%. So it is inaccurate or misleading somehow.
 

TheELF

Diamond Member
Dec 22, 2012
3,973
731
126
Oh I dunno.. actually deliver one of those features on silicon.

You might want to check the actual compute throughput needed on a typical scientific/content workstation versus the capabilities of an Intel integrated GPU before you claim that weakling can handle "workstation" loads.
include an iGPU that can handle some amount of workstation workloads as well,
Details, devils, and such.
 

BonzaiDuck

Lifer
Jun 30, 2004
15,727
1,456
126
I didn't have time to read through all the illuminating posts, but I think I get the gist of the OP's thread and post #1.

Delving into the technical details may explain some things. But consider some economic imperatives across history and several industries. You could read some economics classics from Thorstein Veblen or Frank Knight. There are imperatives in both the profitability and survivability of mature industries. Take for instance the automobile. After WWII, the major manufacturers embarked on a pattern of product differentiation. They produced as many different models as they could, even while using the same parts from their sub-contract suppliers across the various product lines. They had to keep refreshing demand. Sales had to expand year after year. They couldn't let sales shrink.

Here, we have two major competitors -- AMD and Intel -- seemingly comfortable with their market shares. But they have to come up with something new every year.

So maybe -- that's where we are, and that's what "this is".

Honestly, I've been posting here over the past month, having taken a two-year sabbatical. I've been rocking a 2016/2017 Skylake build. I was careless, and killed the motherboard. As we discussed possibilities for a 10th-gen "re-build", I only stand firm with the idea that I spend months planning to put together a new system, even if I could just order a buncha-parts and slap something together over a week's time. I stuck with Skylake and Kaby Lake. Curiosity got the best of me so I bought the Kaby to refresh the replacement identical motherboard. It was fun -- I stand firm with that.

Even today, if I wanted to build a new system with a more recent processor and motherboard, I would pick a hexa-core i5-10600K and go forward continually telling myself that I couldn't use all that processing power.

Could it be that Intel's innovation impulse has stalled? That it has stalled simply because there's not much more that can be done with the technology at this point? I'm speculating. I'm guessing. I could be wrong. But -- think about it . . .
 

Leeea

Diamond Member
Apr 3, 2020
3,625
5,368
136
An iGPU is just fine for the majority of workstation tasks.

Large spreadsheets - for many people, the tool they know how to use is better than the ideal tool; very common in the business world. It is all fun and games until you encounter one with a 5-minute refresh time.

CAD programs - the models stress the CPU far more as the complexity increases; most parts do not get textures and are not geometrically complex, there are just lots and lots of them. It is also very common to have a script spit out the parts list for the model to a massive spreadsheet ...

Photoshop/graphic design - these are 2D layouts; rendering matters less than the CPU, main memory, and SSD.

Programming - the GPU does nothing for compiling, web page layout, etc. Most programmers never optimize things, and most tasks do not justify the cost of developing for GPU processing.

Word processing - when the QA guy sends you a massive 2000-page book with tons of camera-resolution images showing every piece of glass in your entire facility, it hits like a brick.

Business apps - many will pull or push data from Access / SQLite or whatever. Anything that needs to open large XML files needs oomph. The use of local databases is very common.

Yea, people could optimize things, but that takes time. Labor is more expensive than computer parts. Far better to have employees turn out content and do their job than to train them how to resize pictures or fiddle with algorithms. It is always cheaper to buy a bigger computer than to optimize anything.
 
Last edited:

dmens

Platinum Member
Mar 18, 2005
2,271
917
136
An iGPU is just fine for the majority of workstation tasks.

Large spreadsheets - for many people, the tool they know how to use is better than the ideal tool; very common in the business world. It is all fun and games until you encounter one with a 5-minute refresh time.

CAD programs - the models stress the CPU far more as the complexity increases; most parts do not get textures and are not geometrically complex, there are just lots and lots of them. It is also very common to have a script spit out the parts list for the model to a massive spreadsheet ...

Photoshop/graphic design - these are 2D layouts; rendering matters less than the CPU, main memory, and SSD.

Programming - the GPU does nothing for compiling, web page layout, etc. Most programmers never optimize things, and most tasks do not justify the cost of developing for GPU processing.

Word processing - when the QA guy sends you a massive 2000-page book with tons of camera-resolution images showing every piece of glass in your entire facility, it hits like a brick.

Business apps - many will pull or push data from Access / SQLite or whatever. Anything that needs to open large XML files needs oomph. The use of local databases is very common.

Yea, people could optimize things, but that takes time. Labor is more expensive than computer parts. Far better to have employees turn out content and do their job than to train them how to resize pictures or fiddle with algorithms. It is always cheaper to buy a bigger computer than to optimize anything.

I suppose... if you consider Dell Inspirons to be "workstations". In that case, they do not need any vector compute at all so name-dropping Intel's band-aid compute solutions is totally irrelevant.
 

Leeea

Diamond Member
Apr 3, 2020
3,625
5,368
136
I suppose... if you consider Dell Inspirons to be "workstations". In that case, they do not need any vector compute at all so name-dropping Intel's band-aid compute solutions is totally irrelevant.

I do not know much about Dell; I am more familiar with HP's enterprise lineup myself.

However, if I remember right, Dell sells 16x more iGPU workstations than the entire DIY PC market combined. It would appear the vast majority of buyers feel an iGPU is more than good enough for their workstation tasks.

As for vector compute, it is used everywhere. Common programs include the entire Adobe product line including Photoshop, 7-Zip and most file compression programs, media players including VLC, Excel, most programming math libraries, the Java virtual machine, the C# virtual machine, etc. AVX256 is commonly used for disk encryption, file encryption, VPNs, network encryption, etc. If I remember right, Chrome uses it for HTTPS connection decryption/encryption. OpenSSL uses it also.

There is a reason AMD, Intel, and Apple all have vector compute built right into their CPUs. It is ubiquitous.
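
To make that concrete, here is a minimal sketch (plain C with AVX intrinsics) of the kind of SIMD loop those libraries run under the hood; the function and variable names are illustrative only, not taken from any particular program:

Code:
#include <immintrin.h>
#include <stddef.h>

/* Illustrative sketch: sum an array of floats eight at a time using
 * 256-bit AVX registers, then handle the leftovers with scalar code.
 * This is the sort of loop vectorized libraries emit behind the scenes. */
float sum_avx(const float *data, size_t n)
{
    __m256 acc = _mm256_setzero_ps();          /* eight packed float lanes */
    size_t i = 0;
    for (; i + 8 <= n; i += 8)
        acc = _mm256_add_ps(acc, _mm256_loadu_ps(data + i));

    float lanes[8];
    _mm256_storeu_ps(lanes, acc);              /* reduce the eight lanes */
    float total = 0.0f;
    for (int k = 0; k < 8; k++)
        total += lanes[k];

    for (; i < n; i++)                         /* scalar tail, n % 8 items */
        total += data[i];
    return total;
}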

-----------------------------

Yea, people could optimize things, but that takes time. Labor is more expensive than computer parts. Far better to have employees turn out content and do their job than to train them how to resize pictures or fiddle with algorithms. It is always cheaper to buy a bigger computer than to optimize anything.

The above statement applies to individual users optimizing their specific individual projects, which is what dGPU optimization is these days.

The tools above are upstream tools that are being "optimized" by the vendor, because said vendor only has to "optimize" their tool one time to affect all of their users. Of course, if all of the users have AVX256, it could be argued it is less optimization and more implementing the standard.

Most of these tools offer no gain for a dGPU over an iGPU, so for the user there is no point in owning a dGPU. Most of them are unsuitable for GPU processing and will never use it. For example, taking AVX256 decryption out of the secure CPU sandbox and sending it to an unsecure GPU is just stupid. Virtual machines also have to keep their sandboxes. Most programs mix vector instructions in with non-vector instructions in a manner that makes them unsuitable to send to a GPU for special processing.
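
As a rough sketch of that last point (hypothetical code, names made up), notice how the vector math is interleaved with scalar, branchy bookkeeping; shipping just the SIMD part out to a GPU would mean a bus round trip every iteration, which costs far more than it saves:

Code:
#include <immintrin.h>
#include <stdio.h>
#include <stddef.h>

/* Hypothetical example of vector and scalar work interleaved in one loop.
 * The four-wide multiply is SIMD, but the per-record branching and logging
 * around it keep the whole thing pinned to the CPU. */
void process_records(float *vals, int *flags, size_t n)
{
    for (size_t i = 0; i + 4 <= n; i += 4) {
        __m128 v = _mm_loadu_ps(vals + i);          /* vector part */
        v = _mm_mul_ps(v, _mm_set1_ps(1.25f));
        _mm_storeu_ps(vals + i, v);

        for (size_t j = i; j < i + 4; j++) {        /* scalar part */
            if (vals[j] > 100.0f) {
                flags[j] = 1;
                printf("record %zu over limit\n", j);
            }
        }
    }
}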
 
Last edited:

dmens

Platinum Member
Mar 18, 2005
2,271
917
136
I do not know much about Dell; I am more familiar with HP's enterprise lineup myself.

However, if I remember right, Dell sells 16x more iGPU workstations than the entire DIY PC market combined. It would appear the vast majority of buyers feel an iGPU is more than good enough for their workstation tasks.

As for vector compute, it is used everywhere. Common programs include the entire Adobe product line including Photoshop, 7-Zip and most file compression programs, media players including VLC, Excel, most programming math libraries, the Java virtual machine, the C# virtual machine, etc. AVX256 is commonly used for disk encryption, file encryption, VPNs, network encryption, etc. If I remember right, Chrome uses it for HTTPS connection decryption/encryption. OpenSSL uses it also.

There is a reason AMD, Intel, and Apple all have vector compute built right into their CPUs. It is ubiquitous.

-----------------------------



The above statement applies to individual users optimizing their specific individual projects, which is what dGPU optimization is these days.

The tools above are upstream tools that are being "optimized" by the vendor, because said vendor only has to "optimize" their tool one time to affect all of their users. Of course, if all of the users have AVX256, it could be argued it is less optimization and more implementing the standard.

Most of these tools offer no gain for a dGPU over an iGPU, so for the user there is no point in owning a dGPU. Most of them are unsuitable for GPU processing and will never use it. For example, taking AVX256 decryption out of the secure CPU sandbox and sending it to an unsecure GPU is just stupid. Virtual machines also have to keep their sandboxes. Most programs mix vector instructions in with non-vector instructions in a manner that makes them unsuitable to send to a GPU for special processing.

I think we have different definitions of workstation. By vector compute, I meant GPGPU-style computing, which can only be handled by a massively threaded machine (i.e. not a CPU).
 
  • Like
Reactions: dlerious

racetrack

Junior Member
May 3, 2020
1
0
11
I think we have different definitions of workstation. By vector compute, I meant GPGPU-style computing, which can only be handled by a massively threaded machine (i.e. not a CPU).

Since when does using a workstation imply GPGPU computing or vectorized workloads? All of our workstations use iGPUs because nothing else is needed.

In my mind, workstation implies quick and easy parts replacement (plastic clips not screws) and reliability of results (more extensive validation for workstation parts, and use of ECC memory). It has nothing to do with GPUs. For every task you can think of that can be accelerated with a GPU, there are probably 100x more that can't be, or shouldn't be.
 

Kocicak

Senior member
Jan 17, 2019
982
973
136
The topic of this thread is backporting Rocket Lake to 14 nm and its economic sense, and also the sense of Rocket Lake as a possibly short-lived product, which does not bring much more to the table than the last-generation product.

I would be interested to learn something about the technical side of the backporting process too. Is there any technical info about how the 10 nm and 14 nm technologies compare physically, for example what the transistors look like, how the layers and conductors in the silicon look, etc.?
 

Dave2150

Senior member
Jan 20, 2015
639
178
116
The topic of this thread is backporting Rocket Lake to 14 nm and its economic sense, and also the sense of Rocket Lake as a possibly short-lived product, which does not bring much more to the table than the last-generation product.

I would be interested to learn something about the technical side of the backporting process too. Is there any technical info about how the 10 nm and 14 nm technologies compare physically, for example what the transistors look like, how the layers and conductors in the silicon look, etc.?

The 10nm transistors are smaller than the 14nm transistors :D (that's basically all Joe Public knows for sure). What we don't know is what the issue with the 10nm process actually is.

Are yields terrible, hence why only the most expensive server CPUs will be using it at first?
Are clock speeds unable to match 14nm for 8C dies, hence 14nm being superior?

As far as I know, we've never had definitive answers to the above, just speculation.
 

DrMrLordX

Lifer
Apr 27, 2000
21,637
10,855
136
Are yields terrible, hence why only the most expensive server CPUs will be using it at first?

Huh?

10nm was first used in Cannonlake-U
10nm+ was first used in Ice Lake-U
10SF was first used in TigerLake-U
10SFE will be first used in Alder Lake (presumably -P/-M but maybe -S)

Yields being "terrible" is what has apparently led Intel to go with smaller dice.

Intel hasn't launched new processes as "server first" since . . . I don't remember when actually. Even 14nm was mobile-first (Broadwell-U).
 
  • Like
Reactions: scineram

Kocicak

Senior member
Jan 17, 2019
982
973
136
Rocket Lake, in the context of backporting, has been a ‘good attempt’ – good enough to at least launch into the market.
Dr. Cutress wrote this in the new review and I must say I do not understand it. What are these CPUs going to do in the market???

It is like a publisher said: this book is horrible and nobody will really want to read it, but it is good enough to be printed and displayed on the shelf in the shop. We will print a pretty jacket for it.

What would be the point of that???
 
  • Like
Reactions: krumme

Topweasel

Diamond Member
Oct 19, 2000
5,436
1,654
136
All these choices are made years in advance, and if it had been clear that Rocket Lake wasn't going to be competitive with Comet Lake or Zen 3, they probably would have cancelled it, like they have done with half a dozen dies since 10nm showed up screwed up.

So let's say 8 months to a year ago, they had already gotten past the point of no return. Zen 2 was better than they thought, but they were too far into testing with OEMs to pull back, with a platform designed specifically for it already on the market.

It exists because they thought it would be better (if only marginally so), because they needed to keep a release schedule, and because they had already spent millions developing it.

It's the same reason AMD and Nvidia keep releasing new GPUs even though dedicating all wafers to 3090s and 6900 XTs would be better margin. These things were set in motion, on trains that can't stop easily, long before the market we are in became the market we are in.
 

TheELF

Diamond Member
Dec 22, 2012
3,973
731
126
Dr. Cutress wrote this in the new review and I must say I do not understand it. What are these CPUs going to do in the market???

It is like a publisher said: this book is horrible and nobody will really want to read it, but it is good enough to be printed and displayed on the shelf in the shop. We will print a pretty jacket for it.

What would be the point of that???
Huh?! How many of the people that use computers do you think actually use any sort of multithreading?! Even when including gaming, where GPUs are more of a limit than anything else?
People commit to a budget and then buy whatever is closest to that, either in parts or, more often than not, as an OEM system. Rocket Lake will sell like crazy, definitely at least as well as Comet Lake, which will also continue to be made and sold.

Being much better in something that nobody cares about is not a reason for people to choose it.
 

DrMrLordX

Lifer
Apr 27, 2000
21,637
10,855
136
Huh?! How many of the people that use computers do you think actually use any sort of multithreading?! Even when including gaming, where GPUs are more of a limit than anything else?
People commit to a budget and then buy whatever is closest to that, either in parts or, more often than not, as an OEM system. Rocket Lake will sell like crazy, definitely at least as well as Comet Lake, which will also continue to be made and sold.

Being much better in something that nobody cares about is not a reason for people to choose it.

. . . what?
 

Atari2600

Golden Member
Nov 22, 2016
1,409
1,655
136
Huh?! How many of the people that use computers do you think actually use any sort of multithreading?! Even when including gaming, where GPUs are more of a limit than anything else?
People commit to a budget and then buy whatever is closest to that, either in parts or, more often than not, as an OEM system. Rocket Lake will sell like crazy, definitely at least as well as Comet Lake, which will also continue to be made and sold.

Why did Intel not just stick with what they had?

Smaller die: they could produce more, sell at a reduced price, and still make more money... and not be out the cost of the backport.


Oh right, because it appears performance does matter. :rolleyes:
 

krumme

Diamond Member
Oct 9, 2009
5,952
1,585
136
To show shareholders you can release new products.

To keep the stock price up and keep the value of the company.

Releasing Skylake again under a new name would strictly make more sense in the short term. Heck, they have done it 5 times, so a 6th wouldn't make a difference and would be dirt cheap.

But reasons like the above apparently pressure them to release it. Surely it's an important signal.
 
  • Like
Reactions: Tlh97 and Thibsie

Topweasel

Diamond Member
Oct 19, 2000
5,436
1,654
136
As an investor I would not like to see Intel burning millions (10s of millions) developing and making a pointless product.
But decisions like that are made before they know the final binning and where their competitors are going to be. In a perfect world this would clock just a little bit higher and AMD wouldn't keep making these 20% jumps. Heck, when the choice was set in stone they didn't know if AMD could do it even once. Likely they were beyond the point of no return before Zen 2 launched.