Intel Z3770 vs Snapdragon 800 in Kraken 1.1

Status
Not open for further replies.

Homeles

Platinum Member
Dec 9, 2011
That GPU is kind of disappointing, but I'll reserve judgment until it releases.

BTW, does Intel make any architectural performance advancements with its GPUs each generation, aside from enabling them to support newer DX and OGL versions? What I mean is: would 4 Haswell EUs be stronger than 4 Ivy Bridge EUs, etc.?

Lack of more in-depth info about Intel GPU tech is irritating.
Intel's current GPUs follow a model similar to their CPUs' tick-tock model. Every two years brings a large-scale architecture change, with the in-between years bringing tweaks.

4 Haswell EUs are better than 4 Ivy Bridge EUs. I don't know the numbers, but revisions were made that improved performance. In addition, Haswell's EUs clock higher (IIRC).
Isn't this going to be pretty straightforward within the next two years, or am I missing something?

Based on AMAT/LAM comments, it doesn't seem like the industry is going to be adding a ton of 20nm and 20nm/FinFET capacity in the next 12 months.
I think TSMC was stating that 20nm will be their quickest ramp ever.

Not sure what you mean by 20nm/finfet capacity, though -- Intel is the only one with FinFETs right now, and they aren't planning on expanding 22nm capacity AFAIK. However, they have plenty of 22nm capacity available.
Intel has the biggest transistor budget and will have a widening advantage by the second half of next year, even accounting for the minimal volumes we're going to get at 20nm from TSMC.

If I look at the Kraken and IDF comparison slides, they already have superior CPU performance and S600-or-better GPU performance.

If the CPU microarchitecture gets iterated next year, along with potentially two GPU microarchitecture iterations, I don't see how this isn't Intel's game to win from a performance, efficiency, and cost point of view.
I don't see TSMC having low volumes on 20nm. Apple alone will account for a large portion of their sales, which is a huge customer that they did not have before.

CPU performance is in Intel's favor by a pretty wide margin. Their 22nm LP process is far and away better than anything else on the market right now. TSMC's 20nm will help close the gap, but we won't see those designs until after Silvermont is already on the market. Intel's 22nm process will still be superior to TSMC's 20nm process, regardless of time to market.

As far as their GPU goes, it isn't awfully inspiring in my opinion. I don't know how it compares architecturally, but I'd imagine it's not at the top of the list by any means. From what I understand, Intel hasn't devoted a lot of space on the SoC to graphics, so their lackluster performance is an amalgamation of several crippling factors; however, it is largely alleviated by their process advantage.
I am not a tech person, I'm an investor, but the sheer amount of negativity blasted towards Intel is baffling.
There are a lot of members that are bitter about AMD going downhill, and it taints their ability to see the industry as a whole in an objective light. There are a lot of people who wish for Intel to fail, and that results in a rather painful load of confirmation bias.

It's not just here, either. There are a lot of people -- investors like yourself -- who are betting on their favorite horse.

As far as who will win the SoC war, the answer is pretty obvious in my opinion. That answer will become very clear at the end of this year.
 

Vesku

Diamond Member
Aug 25, 2005
Intel wants to maintain a high ASP/Gross Margin even against ARM. Having an underpowered GPU won't help keep the line on price.
 

liahos1

Senior member
Aug 28, 2013
"I think TSMC was stating that 20nm will be their quickest ramp ever.

Not sure what you mean by 20nm/finfet capacity, though -- Intel is the only one with FinFETs right now, and they aren't planning on expanding 22nm capacity AFAIK. However, they have plenty of 22nm capacity available.
I don't see TSMC having low volumes on 20nm. Apple alone will account for a large portion of their sales, which is a huge customer that they did not have before."

On TSMC's last earnings call, they said 20nm planar would be a high-single-digit percentage of revenue next year, and around 2 percent of revenue by Q2. LAM recently brought down their expected guidance for etching tools from 30-50k wafer starts to ~30k. They said the lowered guidance is not because anyone was "skipping" to go straight to FinFET, just that they were delaying. I'm sure 20nm will be a big node, just not next year.

AMAT recently said they expect industry-wide capacity for 20nm/16nm wafer starts to approach 100k. All in, this would (assuming it's all going to TSMC) account for about 14% of their total capacity by year end.

Some of this capacity is going to be eaten up by the FPGA and GPU guys, etc. Also, if the initial ramps at 28nm are any indication to go by, it doesn't seem like Intel should expect a ton of 20nm competition when it begins sampling Airmont parts. Maybe I'm wrong; that's just the chatter I've been getting.
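As a quick back-of-envelope in Python (the division is my own arithmetic; the 100k wafer starts and 14% share are the AMAT figures mentioned above), those two numbers imply a total capacity of roughly 714k wafer starts:

```python
# If ~100k leading-edge (20nm/16nm) wafer starts represent ~14% of
# total capacity, back out the implied total capacity.
leading_edge_starts = 100_000
share_of_total = 0.14

implied_total = leading_edge_starts / share_of_total
print(f"implied total: {implied_total:,.0f} wafer starts")
# -> implied total: 714,286 wafer starts
```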
 

Vesku

Diamond Member
Aug 25, 2005
Sure it won't help, but they make up for it more than enough in other areas.

Graphical activities are a higher portion of tablet/phone activity than low end conventional notebooks and PCs, on average across the consumer base.
 

Homeles

Platinum Member
Dec 9, 2011
On TSMC's last earnings call, they said 20nm planar would be a high-single-digit percentage of revenue next year, and around 2 percent of revenue by Q2. LAM recently brought down their expected guidance for etching tools from 30-50k wafer starts to ~30k. They said the lowered guidance is not because anyone was "skipping" to go straight to FinFET, just that they were delaying. I'm sure 20nm will be a big node, just not next year.
If TSMC does in fact have a relatively small 20nm volume, that will be great news for Intel. Samsung is seemingly still stuck in the mud, and GloFo has been effectively washed out to sea.
Also, if the initial ramps at 28nm are any indication to go by, it doesn't seem like Intel should expect a ton of 20nm competition when it begins sampling Airmont parts. Maybe I'm wrong; that's just the chatter I've been getting.
Perhaps, but Silvermont should hold its own against the ARM club's 20nm parts. I doubt they'll hold onto the performance crown, but cost will be heavily in Intel's favor. 22nm will be a very old node for Intel by then.
Graphical activities are a higher portion of tablet/phone activity than low end conventional notebooks and PCs, on average across the consumer base.
I'm not questioning the veracity of your statement, but I'd be interested in seeing a source. To me, graphics don't seem to be a huge deal, but I also don't use my handheld devices for gaming. Wouldn't battery life and cost be far more important factors?
 

beginner99

Diamond Member
Jun 2, 2009
Graphical activities are a higher portion of tablet/phone activity than low end conventional notebooks and PCs, on average across the consumer base.

Define "graphical activities". Watching videos, the whole UI, recording videos... all that stuff does not require much GPU power at all. Tell me one thing besides gaming that requires a ton of GPU power on a smartphone...

I just don't see why you need it, and even more why I should pay for it. IMHO Intel is making the right decision to save money on the GPU. It's good enough. Single-threaded IPC is still far, far more important than anything else, and here Intel also rules because of Turbo.
 

SiliconWars

Platinum Member
Dec 29, 2012
[Attached image: untitled-25.png]


http://www.businessinsider.com/the-future-of-mobile-slide-deck-2013-3

There's no point in pretending otherwise: gaming is the #1 tablet activity by a huge margin. This has been linked before by others; please remember it so we don't have to keep going around in circles with "nobody needs good graphics on a tablet" any longer.
 

Homeles

Platinum Member
Dec 9, 2011
There's no point in pretending otherwise: gaming is the #1 tablet activity by a huge margin. This has been linked before by others; please remember it so we don't have to keep going around in circles with "nobody needs good graphics on a tablet" any longer.
Well, aren't you a ray of sunshine.

Why don't you tell me what percentage of those games are 3D?

Ooh, here's a fun graph, from the same source, even!
[Attached image: slide.jpg]


Gee, looks like not many people game after all!
 

Exophase

Diamond Member
Apr 19, 2012
Krait is really not a heavy hitter in all categories; it does pretty badly in some benchmarks and decently in others - Cortex-A9 even has better IPC in a few. Krait 400 changes almost nothing uarch-wise vs Krait 300, it just adds more clock speed. Intel has incredible efficiency scaling into the higher MHz - this is obvious even in the much-criticized Saltwell, and in Silvermont it should be better yet.

Is 60% performance here exaggerated? Probably not really, even Saltwell is beating Krait on some benches. I wish I knew more about Qualcomm's CPU - they reveal very little - but I have to guess it has some glass jaws, at the very least. Hasn't stopped them from being immensely successful, though. Who knows what they're planning for their next more substantial CPU revamp.

All of that said, I do think there are some nuances in Javascript (as I've always said) that make it not really all that representative as a benchmark. And while I hate to complain about pretty much every benchmark used, I'm going to have to give the usual aside even about SPECint. It's a good benchmark, but it's indisputable that ICC generates much better code for it on x86 than GCC (VIA did a paper that showed exactly that: a ~20% performance advantage on Nano) and that Intel is going to use ICC for any promotional numbers. That doesn't mean the comparison is illegitimate or that they're doing anything wrong, but I still contend that very little Android software will use ICC except for software sponsored by Intel.
 

SiliconWars

Platinum Member
Dec 29, 2012
I just find it hard to believe that this slide was "forgotten" already. It has barely been a few weeks since it last did the rounds with the same argument.

Intel cannot hope to dominate tablet space with mid-ranked graphics. You are only as good as your weakest point in this, and they cannot rely on Nvidia saving them with discrete cards.

The S800 basically has no weaknesses. Even if the Z3770 is 50% faster in CPU benchmarks with no discernible subjective difference, OEMs will simply not care when it is 70% slower in gaming, which is a much more obvious difference.
 

Homeles

Platinum Member
Dec 9, 2011
I just find it hard to believe that this slide was "forgotten" already. It has barely been a few weeks since it last did the rounds with the same argument.
Yeah, well not everybody was around a few weeks ago, now were they? I sincerely apologize for not revolving around your schedule.
You are only as good as your weakest point in this
[Citation needed]
The S800 basically has no weaknesses. Even if the Z3770 is 50% faster in CPU benchmarks with no discernible subjective difference, OEMs will simply not care when it is 70% slower in gaming, which is a much more obvious difference.
Unfortunately for your argument, battery drain is an objective measurement. The quicker tasks are completed, the sooner the CPU returns to idle.
 

SiliconWars

Platinum Member
Dec 29, 2012
Well, aren't you a ray of sunshine.

Why don't you tell me what percentage of those games are 3D?

Ooh, here's a fun graph, from the same source, even!
[Attached image: slide.jpg]


Gee, looks like not many people game after all!

Well we know from the other thread what you consider to be "not many".

In my book, some 35% of the "mobile population" (54% of the US) must be pretty near the 70 million mark.

Yeah...not many. And that's just those with smartphones.
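For what it's worth, that estimate can be sanity-checked with quick arithmetic (the ~315 million 2013 US population figure below is my assumption, not something from the slide); it lands in the same tens-of-millions ballpark, a bit closer to 60 than 70 million:

```python
# Rough check: ~54% of the US owns a smartphone, and ~35% of the
# mobile population plays games on it.
us_population = 315_000_000   # assumed 2013 figure
smartphone_share = 0.54
gamer_share = 0.35

mobile_gamers = us_population * smartphone_share * gamer_share
print(f"{mobile_gamers / 1e6:.0f} million")  # -> 60 million
```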
 

Homeles

Platinum Member
Dec 9, 2011
Krait is really not a heavy hitter in all categories; it does pretty badly in some benchmarks and decently in others - Cortex-A9 even has better IPC in a few. Krait 400 changes almost nothing uarch-wise vs Krait 300, it just adds more clock speed. Intel has incredible efficiency scaling into the higher MHz - this is obvious even in the much-criticized Saltwell, and in Silvermont it should be better yet.

Is 60% performance here exaggerated? Probably not really, even Saltwell is beating Krait on some benches. I wish I knew more about Qualcomm's CPU - they reveal very little - but I have to guess it has some glass jaws, at the very least. Hasn't stopped them from being immensely successful, though. Who knows what they're planning for their next more substantial CPU revamp.
Honestly, I think that Apple will become the leading ARM dev, in terms of performance.
Well we know from the other thread what you consider to be "not many".

In my book, some 35% of the "mobile population" (54% of the US) must be pretty near the 70 million mark.

Yeah...not many. And that's just those with smartphones.
Not many compared to the 65% that don't. I don't know why relative terms are so difficult for you and your kind.

If your "Intel has bad GPUs" argument only applies to 35% of the population, then it's not a very strong one, now is it?

Since you seem to have missed this, I'm not saying graphics aren't important. But to say that they are the top determiner in product sales is completely asinine.
 

Abwx

Lifer
Apr 2, 2011
I wish I knew more about Qualcomm's CPU - they reveal very little - but I have to guess it has some glass jaws, at the very least. Hasn't stopped them from being immensely successful, though. Who knows what they're planning for their next more substantial CPU revamp.

28nm at GloFo, possibly FD-SOI, since actual samples of a dual-core A9 were produced and showed TDP improvements amounting to almost a full node when going from bulk 28nm LP to 28nm FD-SOI.
 

Abwx

Lifer
Apr 2, 2011
Unfortunately for your argument, battery drain is an objective measurement. The quicker tasks are completed, the sooner the CPU returns to idle.

Consuming two times more power to reduce the time by 30% is pointless and will reduce battery life.

There's no magic trick: power rises non-linearly with frequency, generally following at least a square law.
 

SiliconWars

Platinum Member
Dec 29, 2012
Not many compared to the 65% that don't. I don't know why relative terms are so difficult for you and your kind.

By that incredible slice of logic, I guess there are "not many" non-red stars in the entire universe.

Hint: you didn't specify anything else except "not many" (before I get accused of another strawman :p).

If your "Intel has bad GPUs" argument only applies to 35% of the population, then it's not a very strong one, now is it? Also, do note that the graph states "mobile," not "smartphone."
And when will you realise that people who don't game (note that 20% haven't even sent a text, btw) still buy tablets and phones that are capable of gaming well?

Let's see. My sister has an iPad mini she doesn't game on, my brother has an iPad he doesn't game on. BUT WAIT A MINUTE - they have 2 kids who game on them and guess what? Yep you guessed it, they don't count towards those statistics. This pattern will be repeated all over the world.

The graph says "mobile"; however, my 54% number was for smartphones only, btw. The number of mobile gamers in the US could easily be 100 million by now.
 

sushiwarrior

Senior member
Mar 17, 2010
Blowout CPU performance, solid GPU, probably really good power consumption - yep, looks like a winner.

SiliconWars, weren't you saying that a 60% performance advantage in CPU wasn't likely? Well, there are your SPECInt results...of course, unless that benchmark - like Kraken - isn't representative.

All we need is a Geekbench 3 result, and then you can go about "cleaning up"...;)

Do you honestly believe that Intel would release results for something where they WEREN'T winning by a huge margin? Obviously the results they choose to show are wins for them. I also like how "solid GPU" (i.e., it only gets outperformed by a multiple you can count on your hands, unlike previous attempts...) and "good power consumption" are already being touted. You're basing that on nothing but a spec sheet.
 

Homeles

Platinum Member
Dec 9, 2011
Consuming two times more power to reduce the time by 30% is pointless and will reduce battery life.

There's no magic trick: power rises non-linearly with frequency, generally following at least a square law.
Hey buddy, it's a shame you didn't know this, but if something's 50% faster at the same power draw, you can lower the frequency to stay on par with your competitor's performance, while using less than 50% of the power.
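The trade-off both posters are arguing over can be sketched numerically. This is only a first-order model, assuming dynamic power scales with f·V² and voltage scales roughly with frequency (so power ~ f³), not a measurement of any actual chip:

```python
# First-order CMOS model: P ~ f * V^2, and V ~ f, so P ~ f^3.
def relative_power(freq_ratio):
    """Power relative to baseline when frequency scales by freq_ratio."""
    return freq_ratio ** 3

# Chip A is 50% faster than chip B at equal power.
# Downclock A to match B's performance: frequency ratio = 1 / 1.5.
f_ratio = 1 / 1.5
p_ratio = relative_power(f_ratio)   # ~0.30x the power
e_ratio = p_ratio * 1.5             # energy per task: power * (1.5x the time)

print(f"power: {p_ratio:.2f}x, energy per task: {e_ratio:.2f}x")
# -> power: 0.30x, energy per task: 0.44x
```

Under this model, both points survive: upping frequency is indeed super-linear in power, and a chip that is 50% faster at equal power can be downclocked to parity and finish the same task on well under half the energy.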
 

SiliconWars

Platinum Member
Dec 29, 2012
Since you seem to have missed this, I'm not saying graphics aren't important. But to say that they are the top determiner in product sales is completely asinine.

Considering the big sellers have all been iPads, and they've all had good gaming performance on release, you can't realistically claim otherwise.
 

Homeles

Platinum Member
Dec 9, 2011
By that incredible slice of logic, I guess there are "not many" non-red stars in the entire universe.

Hint: you didn't specify anything else except "not many" (before I get accused of another strawman :p).
Right, because stars have a lot to do with the context of this discussion.
And when will you realise that people who don't game (note that 20% haven't even sent a text, btw) still buy tablets and phones that are capable of gaming well?
Okay now there's a straw man. I'm well aware of that, genius.

Now why don't you tell me why those kinds of people would prioritize GPU performance?
Let's see. My sister has an iPad mini she doesn't game on, my brother has an iPad he doesn't game on. BUT WAIT A MINUTE - they have 2 kids who game on them and guess what? Yep you guessed it, they don't count towards those statistics. This pattern will be repeated all over the world.
Sorry, you've lost me. I can't figure out what you're ranting about.
The graph says "mobile"; however, my 54% number was for smartphones only, btw. The number of mobile gamers in the US could easily be 100 million by now.
Once again, how many of those games are 3D?
Considering the big sellers have all been iPads, and they've all had good gaming performance on release, you can't realistically claim otherwise.
Oh God, did you really just try to say that graphics performance is what's selling iPads?

Exophase, get over here. I need a better person to debate...
 

Exophase

Diamond Member
Apr 19, 2012
Honestly, I think that Apple will become the leading ARM dev, in terms of performance.

I don't expect this because I don't think they're aggressive enough in the power budget they give their CPUs.

It started with A5 where the dual Cortex-A9s were only clocked at 800MHz, while virtually the same core (same process, same layout, die shots show it looks the same) was clocked at 1.2GHz and higher in Samsung SoCs.

Similar story with Swift - it's a relatively strong uarch in terms of perf/MHz, but it only clocks at 1.3GHz in phones and 1.4GHz in tablets. It's possible that Apple deliberately designed it not to scale well beyond this, to optimize for better perf/MHz and ultimately perf/W.
 

liahos1

Senior member
Aug 28, 2013
I just find it hard to believe that this slide was "forgotten" already. It was barely a few weeks since it last did the rounds with the same argument.

Intel cannot hope to dominate tablet space with mid-ranked graphics. You are only as good as your weakest point in this, and they cannot rely on Nvidia saving them with discrete cards.

The S800 basically has no weaknesses. Even if the Z3770 is 50% faster in CPU benchmarks with no discernible subjective difference, OEMs will simply not care when it is 70% slower in gaming, which is a much more obvious difference.

By that argument, shouldn't we have seen more design wins for Tegra 4? It has a faster CPU and a comparable GPU.
 

Homeles

Platinum Member
Dec 9, 2011
I don't expect this because I don't think they're aggressive enough in the power budget they give their CPUs.

It started with A5 where the dual Cortex-A9s were only clocked at 800MHz, while virtually the same core (same process, same layout, die shots show it looks the same) was clocked at 1.2GHz and higher in Samsung SoCs.

Similar story with Swift - it's a relatively strong uarch in terms of perf/MHz, but it only clocks at 1.3GHz in phones and 1.4GHz in tablets. It's possible that Apple deliberately designed it not to scale well beyond this, to optimize for better perf/MHz and ultimately perf/W.
Sorry, I should have chosen my words a bit better. What I meant is that I expect Apple to have the "best" ARM SoCs -- I expect them to be the most architecturally advanced, I suppose? Obviously that's not something that's easy to quantify...
 

SiliconWars

Platinum Member
Dec 29, 2012
Right, because stars have a lot to do with the context of this discussion.

If you're going to change your argument to "relative numbers" then yes, they do.

Now why don't you tell me why those kinds of people would prioritize GPU performance?
Sorry, you've lost me. I can't figure out what you're ranting about.
You can't make the simple leap between those two?

Do you really think most tablet buyers are going in to shops and running SPECint on them before making a purchase decision?

Once again, how many of those games are 3D?
Why don't you do some work for yourself, google it and come back and tell us.

Oh God, did you really just try to say that graphics performance is what's selling iPads?
You really do lack basic logic, don't you, Homeles? What I'm saying is that Apple made the choice to go with big graphics performance in their tablets. If gaming performance wasn't important, then why waste a huge amount of die area on graphics? If you can't figure out what that means, then that's your problem.
 