
[AT] Samsung Announces 14nm FinFET for Exynos 7

Impressive. Will likely be out in early March, just after MWC. Samsung is holding a press conference on 1 March. Intel's process tech lead is shrinking fast.

It's interesting that Qualcomm/Samsung/Apple/AMD stood still at 28 nm for so long, and now all of a sudden they are forging ahead to 14 nm, passing 20 nm quickly.
 
They should have called it 20FF. It's even worse than TSMC 16FF.

Will be interesting to see what volumes they can supply and when. Let's hope it won't be limited to the Korean market.

I wonder how much of their finfet process they stole from TSMC.
 
They should have called it 20FF. It's even worse than TSMC 16FF.

Will be interesting to see what volumes they can supply and when. Let's hope it won't be limited to the Korean market.
Well, Samsung's not using Qualcomm for the Galaxy S6, so it's definitely coming to NA and elsewhere at some point.
 
They should have called it 20FF. It's even worse than TSMC 16FF.

Will be interesting to see what volumes they can supply and when. Let's hope it won't be limited to the Korean market.

I wonder how much of their finfet process they stole from TSMC.

In what way is it worse than TSMC's 16FF?
 
For comparison, the 20nm Exynos 7410 is clocked at 1.9GHz. I wonder if the GPU got a major upgrade, since the 7420 is only clocked at 2.1GHz.
 
Impressive. Will likely be out in early March, just after MWC. Samsung is holding a press conference on 1 March. Intel's process tech lead is shrinking fast.
Come on, you're still saying this? At the very least, you should find your stance to be highly debatable, rather than continuing to obstinately present it as fact.

TSMC's 10nm isn't coming until 2017. Samsung's 10nm will show up sometime in 2016, but their specifications are much more lax. Intel's is likely to show up late 2016/early 2017, but will easily outperform their two closest competitors.
It's interesting that Qualcomm/Samsung/Apple/AMD stood still at 28 nm for so long, and now all of a sudden they are forging ahead to 14 nm, passing 20 nm quickly.
Samsung's 20nm was very late -- their 14nm came very early. Their quick abandonment of 20nm makes a lot of sense.

AMD is still up in the air. Nobody knows what they're doing; heck, even they don't know what they're doing.

Qualcomm's been using 20nm for about a year to some extent. They are going to be on that node for some time still.

Apple's been on 20nm for a year, and will likely be switching to Samsung's 14nm after Samsung's had their turn. So, Q2 perhaps, with products showing up this fall.

Just laying things out -- not necessarily disagreeing with you here. Adoption of 16/14nm should be very quick -- FinFETs are the biggest advancement in process tech since transistors were conceived.
 
http://electroiq.com/blog/2014/02/the-most-expensive-sram-in-the-world-2-0/
https://www.semiwiki.com/forum/content/3759-intel-versus-tsmc-14nm-processes.html

Samsung, strategically enough, supplied two SRAM versions:
Samsung 14FF HD: 0.064 µm²
Samsung 14FF HP: 0.080 µm²
TSMC 16FF: 0.070 µm²
Intel 14FF: 0.0588 µm²

The TSMC 16FF figure is the high-density (HD) cell. Intel's high-density cell is 0.05 µm². All processes have high-density, low-voltage, and high-performance SRAM cells 🙂

On an apples-to-apples basis, Intel 14nm is the leader, followed by Samsung 14nm, and then TSMC 16nm.
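As a rough sanity check on that ordering, here's a small Python sketch (illustrative only; it uses the HD cell areas listed above, ignores array overhead like sense amps and decoders, and the 0.05 µm² figure for Intel's HD cell would widen the gap further):

```python
# Convert HD SRAM bitcell area into raw bit density; smaller cell = denser.
# Figures are the HD cell areas quoted above, in µm² per bitcell.

hd_cells_um2 = {
    "Intel 14FF":   0.0588,
    "Samsung 14FF": 0.064,
    "TSMC 16FF":    0.070,
}

densest = min(hd_cells_um2.values())
for name, area in sorted(hd_cells_um2.items(), key=lambda kv: kv[1]):
    bits_per_mm2 = 1e6 / area  # 1 mm² = 1e6 µm²
    print(f"{name}: {area:.4f} µm²/bit ~= {bits_per_mm2 / 1e6:.1f} Mbit/mm² "
          f"({area / densest:.2f}x the smallest cell)")
```

That gives roughly 17.0, 15.6, and 14.3 Mbit/mm² of raw bitcell density for Intel, Samsung, and TSMC respectively -- the same ordering as above.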
 
Intel's process tech lead is shrinking fast.
Only because Intel has had major issues with 14nm. TSMC and Samsung aren't immune to slow node cadences either; it's been more than three years now since AMD's 28nm GPUs.

My suspicion is that 10nm might separate the strong from the weak.
 
Samsung has slightly denser SRAM, it seems: 0.064 µm² vs 0.070 µm². Not sure on gate pitch.

TSMC's 16FF has a 90nm gate pitch and 64nm minimum metal pitch. Samsung 14nm has a 78nm gate pitch and 64nm minimum metal pitch. Intel 14nm is at a 70nm gate pitch and 52nm minimum metal pitch.
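For a rough sense of what those pitches imply for logic density, here's a quick Python sketch multiplying gate pitch by minimum metal pitch. This is only a crude proxy (track height, fin pitch, and cell architecture all matter), using just the figures quoted above:

```python
# Crude logic-density proxy: gate pitch x minimum metal pitch.
# A smaller product suggests a smaller standard-cell footprint,
# all else being equal (which it never quite is).

pitches_nm = {             # (gate pitch, minimum metal pitch)
    "Intel 14nm":   (70, 52),
    "Samsung 14nm": (78, 64),
    "TSMC 16FF":    (90, 64),
}

for name, (gate, metal) in sorted(pitches_nm.items(),
                                  key=lambda kv: kv[1][0] * kv[1][1]):
    print(f"{name}: {gate} nm x {metal} nm = {gate * metal} nm^2")
```

That works out to 3640, 4992, and 5760 nm² for Intel, Samsung, and TSMC respectively, matching the apples-to-apples ranking given earlier.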
 
People talk about Intel's shrinking lead, but Qualcomm's S810 problems, and the fact that its GPU development is slowing down, are also a story. People have been unwilling to buy Samsung SoCs in the past because of Samsung's dominance, and because those SoCs were not great.

Will be interesting to see if that changes; it should. Qualcomm's high-end monopoly may come to an end, precisely because Samsung is doing so poorly right now compared to the past.

Intel having someone actually breathing down their neck for the first time in a long time will do them good too, re: process node.

All in all, this is good news. I hope Samsung blasts it out of the park, and that the E7420 @ 14 nm will be far better and lead to increased competition.

Maybe Samsung will also, finally, abandon the silly Tizen project and start to create their own entirely custom CPU, like Apple or Qualcomm's Krait (the S810 is not Krait). And while they're at it, do a custom GPU, too.
 
Only because Intel has had major issues with 14nm. TSMC and Samsung aren't immune to slow node cadences either; it's been more than three years now since AMD's 28nm GPUs.

My suspicion is that 10nm might separate the strong from the weak.
Well, they're all strong. I really want to make a point of that -- all three of these foundries are in great shape. TSMC had a great 28nm and 20nm; Samsung will have a great 14nm.

Intel's still new to the for-customers foundry game. At any rate, they've gone from basically zero customers to somewhere around ten. When Altera puts out products later this year, we'll probably hear of a number of other companies signing up.
Maybe Samsung will also, finally, abandon the silly Tizen project and start to create their own entirely custom CPU, like Apple or Qualcomm's Krait (the S810 is not Krait). And while they're at it, do a custom GPU, too.
Samsung announced a year or two ago that they were doing a custom CPU design. It's tough to say at this point, but their Exynos 7 might be the start of that.
 
Just laying things out -- not necessarily disagreeing with you here. Adoption of 16/14nm should be very quick -- FinFETs are the biggest advancement in process tech since transistors were conceived.
Not sure if anything can match HKMG's *gigantic* (100x, iirc) decrease in leakage, though. The switch to CMOS was also pretty big, I think.

[Image: Intel technology roadmap]
 
Not sure if anything can match HKMG's *gigantic* (100x, iirc) decrease in leakage, though. The switch to CMOS was also pretty big, I think.
Well, CMOS wasn't anything new prior to its mass adoption. It was just too expensive initially.

HKMG was definitely a big deal as well -- we'd be stuck without it. I think in terms of end-user benefit, though, FinFETs are a bigger deal.
 
HKMG was definitely a big deal as well -- we'd be stuck without it. I think in terms of end-user benefit, though, FinFETs are a bigger deal.
Well, the S800 was a decent improvement in battery life and power consumption. III-V and EUV might be an even bigger deal -- without EUV, no more Moore's Law.
 
Having issues finding the original source, but if you google "intel 0.050 µm2," you'll get quite a few hits from second-hand sources.
Well, the S800 was a decent improvement in battery life and power consumption. III-V and EUV might be an even bigger deal -- without EUV, no more Moore's Law.
Post-silicon will be truly tremendous. EUV will be great for costs.
 
An interesting observation...

0.05/0.064 = 0.78125 (ratio of HD SRAM cell sizes)

(52)(70)/(64)(78) ~= 0.73 (ratio of gate pitch × minimum metal pitch products).

Now, what's really interesting is that Intel's 14nm M0 is 56nm (M1 is 70nm; M2-Mx can be 52nm). If you look at the following ratio:

(56)(70)/(64)(78) ~= 0.785

That corresponds pretty darn closely to the HD SRAM cell ratio...
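The arithmetic checks out; here's a tiny Python snippet reproducing it, with the numbers taken straight from the posts above:

```python
# Reproduce the ratios above: HD SRAM cell areas vs pitch products.
intel_hd_um2, samsung_hd_um2 = 0.05, 0.064
print(f"SRAM cell ratio: {intel_hd_um2 / samsung_hd_um2:.5f}")  # 0.78125

samsung_product = 64 * 78  # min metal pitch x gate pitch, nm^2
print(f"(52)(70)/(64)(78) ~= {52 * 70 / samsung_product:.3f}")  # 0.729, M2+ pitch
print(f"(56)(70)/(64)(78) ~= {56 * 70 / samsung_product:.3f}")  # 0.785, M0 pitch
# The M0-based product tracks the HD SRAM cell ratio almost exactly --
# the correspondence the post points out.
```

If that's not a coincidence, it would suggest Intel's SRAM arrays are drawn against the tighter M0 pitch rather than the relaxed upper-metal pitch, though that's speculation on my part.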
 