AMD Q415 results

Mar 10, 2006
11,715
2,012
126
If I have any disappointment with Skylake, it is that the basic architecture did not bring as much improvement in top-end performance as I had hoped *without* resorting to things like eDRAM.

I mean, I can completely understand this, but at the same time its predecessors are quite fast/potent so it's a high bar to clear.
 

Abwx

Lifer
Apr 2, 2011
10,953
3,474
136
I mean, I can completely understand this, but at the same time its predecessors are quite fast/potent so it's a high bar to clear.

There's a relevant thread for that discussion, which has been going for 10+ posts and has no relevance to this very thread. Looks like the same people as usual love to derail AMD-related threads...
 
Mar 10, 2006
11,715
2,012
126
There's a relevant thread for that discussion, which has been going for 10+ posts and has no relevance to this very thread. Looks like the same people as usual love to derail AMD-related threads...

Sorry, your Highness, didn't know it was a crime to respond to posts in a thread that the moderators seem to be OK with.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
There's a relevant thread for that discussion, which has been going for 10+ posts and has no relevance to this very thread. Looks like the same people as usual love to derail AMD-related threads...

Yep, it was a dedicated member of the ADF that started in on Skylake performance. As you said, same people as always.
 

Fjodor2001

Diamond Member
Feb 6, 2010
3,783
254
126
First link is broken, second says nothing about this. No luck with your 'Wikipedia' references. :(
According to the Wikipedia article history, some Anandtech forum member (with IP 80.198.72.114) has now gone ahead and ninja-removed the content from the Wikipedia article today, see the modification here:

https://en.wikipedia.org/w/index.php?title=Skylake_%28microarchitecture%29&type=revision&diff=701448015&oldid=700828626

The desperation of some IST members on this forum is astonishing!

Anyway, here is the original article (i.e. earlier revision):
https://en.wikipedia.org/w/index.ph...oarchitecture)&direction=prev&oldid=701448015

and one of the sources:
http://www.pcadvisor.co.uk/news/pc-...ips-to-appear-in-tablets-pcs-servers-3600550/
and another one, not from Wikipedia:
http://www.pcworld.com/article/2893112/intels-skylake-chips-to-appear-in-tablets-pcs-servers.html

If you check the geolocation of IP 80.198.72.114 (e.g. using this: https://www.iplocation.net/ ), it says "Denmark". Now what usual suspect could that be...? ;)
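For anyone who wants to repeat the lookup themselves rather than take a screenshot on faith, here is a rough Python sketch. It is only an illustration: it assumes the free ip-api.com JSON endpoint is reachable and that its field names ("country", "city") haven't changed, and geolocation of a residential IP is approximate at best.

Code:
import json
import urllib.request

def geolocate(ip: str) -> dict:
    # Query the (assumed) free ip-api.com JSON endpoint for a rough location.
    with urllib.request.urlopen(f"http://ip-api.com/json/{ip}", timeout=10) as resp:
        return json.load(resp)

if __name__ == "__main__":
    info = geolocate("80.198.72.114")
    # Prints whatever country/city the geolocation database guesses.
    print(info.get("country"), info.get("city"))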
 
Last edited:

VirtualLarry

No Lifer
Aug 25, 2001
56,348
10,048
126
Remember all the hype about Skylake being the most significant architectural change since the jump from Presler/Cedar Mill to Conroe? It wasn't.
No, it was/is.

Look up what they are doing with SGX, Software Guard Extensions. A new way for undetectable, virtually un-killable (once we get persistent memory, with 3D XPoint) malware to infect our otherwise creaky Windows systems.

Being the most significant architectural change since Conroe is not about the performance leaps, but about the features.
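If you want to see whether your own chip even advertises SGX, here is a quick Linux-only Python sketch. It just reads the CPUID feature flags the kernel exposes in /proc/cpuinfo; the "sgx" flag corresponds to CPUID.(EAX=7,ECX=0):EBX bit 2, and older kernels may not report it even on SGX-capable parts, so treat a negative result with some suspicion.

Code:
def cpu_flags() -> set:
    # Collect the feature flags the Linux kernel lists in /proc/cpuinfo.
    flags = set()
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                flags.update(line.split(":", 1)[1].split())
    return flags

if __name__ == "__main__":
    # "sgx" mirrors CPUID.(EAX=7,ECX=0):EBX bit 2 on kernels new enough to report it.
    print("SGX advertised:", "sgx" in cpu_flags())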
 

Abwx

Lifer
Apr 2, 2011
10,953
3,474
136
Sorry, your Highness, didn't know it was a crime to respond to posts in a thread that the moderators seem to be OK with.

I could have picked some other post, but yours was the last one at the moment. Anyway, to get back on topic, here are some statements by Lisa Su from her conference call:


In client computing our opportunities to regain share in 2016 will be driven by our design win momentum, continued progress expanding into the commercial market and reentering the high performance desktop market late in the year with our Zen based Summit Ridge CPU.
She's saying that the consumer-oriented Zen will be in stores in 2016, so there will be no delay with respect to what was announced last year.
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,142
131
If you check the geolocation of IP 80.198.72.114 (e.g. using this: https://www.iplocation.net/ ), it says "Denmark". Now what usual suspect could that be...? ;)

I was aware of the PCWorld article before; there's no quote from Intel there either, so I'm not sure where they got that from.

Kirk Skaugen did say this:

Today is an exciting day, not only for Intel but the technology industry at large, as we introduce the 6th Gen Intel® Core™ processor family. It is – hands down – Intel’s best processor ever, and with the near-simultaneous debut of Windows 10 and exciting new hardware, we are entering a new era of computing that will bring unprecedented experiences to consumers and business users.

Which is true, of course any new processor should be better than its predecessors.

At the end of the day I will consider any mention of this as trolling/flaming until someone clarifies who at Intel said it and when. It looks like their statement (if real) has been taken out of context.
 

positivedoppler

Golden Member
Apr 30, 2012
1,103
171
106
According to the Wikipedia article history, some Anandtech forum member (with IP 80.198.72.114) has now gone ahead and ninja-removed the content from the Wikipedia article today, see the modification here:

https://en.wikipedia.org/w/index.ph...&type=revision&diff=701448015&oldid=700828626

The desperation of some IST members on this forum is astonishing!

Anyway, here is the original article (i.e. earlier revision):
https://en.wikipedia.org/w/index.ph...oarchitecture)&direction=prev&oldid=701448015

and one of the sources:
http://www.pcadvisor.co.uk/news/pc-...ips-to-appear-in-tablets-pcs-servers-3600550/
and another one, not from Wikipedia:
http://www.pcworld.com/article/2893112/intels-skylake-chips-to-appear-in-tablets-pcs-servers.html

If you check the geolocation of IP 80.198.72.114 (e.g. using this: https://www.iplocation.net/ ), it says "Denmark". Now what usual suspect could that be...? ;)

Why would somebody be so committed to projecting Intel's image on the internet while relentlessly attacking AMD? What possible motivation is there?
 

Burpo

Diamond Member
Sep 10, 2013
4,223
473
126
According to the Wikipedia article history, some Anandtech forum member (with IP 80.198.72.114) has now gone ahead and ninja-removed the content from the Wikipedia article today,

Maybe because Intel never said that it was "their most significant processor". Some random Wiki contributor did that.

In fact, Intel said... "Introducing Our Best Processor ever"
http://www.intel.com/content/www/us...tnd8NHplvSQUgk-RIaa6bN4ZNP9TCQZw47xoCQvPw_wcB


I really don't see what any of this has to do with "AMD Q415 results"
 
Last edited:

dud

Diamond Member
Feb 18, 2001
7,635
73
91
I don't have much of a dog in any CPU fanboy fight, but IMHO ANY bad results from AMD are bad news for all enthusiasts. Without competition, innovation will be stifled. Just look at the lackluster performance improvement from Haswell to Skylake. If Intel had any credible competition they would have markedly improved their next-generation CPU performance. Consider the possibility that you are being played ...


... or Moore's law is kicking in.


Fanboys of any flavor (either AMD or Intel) make me laugh ...
 
Last edited:
Aug 11, 2008
10,451
642
126
Why would somebody be so committed to projecting Intel's image on the internet while relentlessly attacking AMD? What possible motivation is there?

It works both ways. There certainly are plenty of posters on these forums who are just as determined (or more so) to denigrate everything Intel does.
 

DrMrLordX

Lifer
Apr 27, 2000
21,637
10,855
136
I too would have liked to see more from Skylake, especially on the desktop. But it seems a bit unfair to compare a chip with eDRAM to one without.

Is it? We routinely compare chips without L3 cache (Kaveri) with chips that have L3 (Haswell/Skylake i3). We've also seen the lower-clocked 5960X beat the 4790K on certain game tests, leading many to draw the conclusion that the difference in performance can be attributed to the size of the L3 cache.

Intel could be putting a larger L3 on their chips, or eDRAM, their choice. That would goose up performance. They have done neither for LGA1151.

The 5775C has eDRAM. Doesn't this bode very well for Skylake with eDRAM?

Sure! So where is it? Sadly, for LGA1151, the answer today is "nowhere".

I guess if one wants to find something to criticize, they can find an article to support it. The article is basically a complaint that Intel does not include eDRAM on Skylake.

Yup! Basically. Intel has no real incentive (yet) to put eDRAM on all of their top-end chips for non-HEDT systems, and they want to keep L3 sizes smaller, too, to maintain product segmentation. They had to put it on the 5775C and 5675C so that Iris Pro wouldn't suffer. But doing so let the cat out of the bag, at least for those who were not paying much attention to the 4770R . . .

But what happened with BW-C? It was roundly criticized for being too expensive and for the Iris Pro being useless because the CPU would almost always be used with a discrete card. So Intel gets criticized either way.

Those criticizing the 5775C for the above-quoted reasons were not thinking clearly. 1) The Iris Pro iGPU could be useful for OpenCL 2.0/Vulkan/DX12-accelerated GPGPU, which is going to be a thing, eventually; 2) the price/availability of Broadwell-C was mostly due to the yields of the 14nm CPU die, NOT the eDRAM die that was fabbed using a high-yield, mature 22nm process; 3) LGA1150 CPUs all have iGPUs anyway, why not get one that has something useful for everyone (L4 eDRAM)?
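For anyone curious whether their iGPU is even visible as a compute device, here is a minimal Python sketch using pyopencl. It assumes the pyopencl package and a vendor OpenCL runtime (Intel's, in the Iris Pro case) are installed; it only enumerates devices, it doesn't benchmark anything.

Code:
import pyopencl as cl

# List every OpenCL platform and device the installed runtimes expose,
# so you can see whether the iGPU shows up for GPGPU work at all.
for platform in cl.get_platforms():
    print("Platform:", platform.name, "|", platform.version)
    for dev in platform.get_devices():
        print("  Device:", dev.name,
              "| type:", cl.device_type.to_string(dev.type),
              "| compute units:", dev.max_compute_units)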

And the author of that article makes it sound like a huge performance loss without e-dram, while it was what, 5 to 10% in some apps and games. (Hard to tell really because there was no BW without e-dram for direct comparison).

Right, but it's doing so with a base clockspeed of only 3.3 GHz! It only turbos to 3.7 GHz. It's like . . . why is this CPU even in the same ballpark?

And I think it has turned out that Skylake with fast ram is at least as fast or faster than BW-E in pretty much every game.

How fast does the RAM have to be to make that happen? And what happens when you actually OC the 5775C?

Not to mention, I think desktop skylake will eventually be available with e-dram.

That is what everyone ought to want from Intel. But again, the only thing they're competing with right now is themselves, so . . . they're gonna stall on it, and it's all over their roadmaps.
 

videogames101

Diamond Member
Aug 24, 2005
6,777
19
81
I don't have much of a dog in any CPU fanboy fight, but IMHO ANY bad results from AMD are bad news for all enthusiasts. Without competition, innovation will be stifled. Just look at the lackluster performance improvement from Haswell to Skylake. If Intel had any credible competition they would have markedly improved their next-generation CPU performance. Consider the possibility that you are being played ...


... or Moore's law is kicking in.


Fanboys of any flavor (either AMD or Intel) make me laugh ...

You're a moron.

If any significant new research had come along in the CPU architecture space, they would have implemented it. Do you think the guys who designed Haswell left anything on the table? Of course not, so how do you expect Skylake to be much better, given how little CPU architecture science has actually changed in the last 5 years? Sure, they could target different TDPs and core counts, but I'm assuming you're speaking about single-core perf.

insulting other members is not allowed
Markfw900
 
Last edited by a moderator:
Mar 10, 2006
11,715
2,012
126
You're a moron.

If any significant new research had come along in the CPU architecture space, they would have implemented it. Do you think the guys who designed Haswell left anything on the table? Of course not, so how do you expect Skylake to be much better, given how little CPU architecture science has actually changed in the last 5 years? Sure, they could target different TDPs and core counts, but I'm assuming you're speaking about single-core perf.

You are a CPU designer, yes? It's good to see people with actual knowledge of the industry shoot down the FUD that Intel is just "dribbling out" improvements and not doing the best they can.

:thumbsup:
 

Dresdenboy

Golden Member
Jul 28, 2003
1,730
554
136
citavia.blog.de
Oops, I forgot to answer these comments.
Interesting comment. How do you see AMD stock now btw? I remember you talked about volume and buyers? :)
Besides the interesting trading patterns (medium-term, I suppose), the stock is just as influenced by general market and world news. As China's market (which also plays an important role for AMD) isn't growing as much as many hoped (for their money) before being brought back to the realism of this world, some might have jumped off faster than they would from bigger investments ("too big to fail"? ;)).

As I say, "safety" of stock price movements only counts for hours or days. Longer exposure increases the risk of the next news coming in.

Why sell cheap desktop dies when you can go all servers?
Servers first. There are different requirements for availability than for desktop chips. So once servers are served, HEDT could receive some more dies.
 

Dresdenboy

Golden Member
Jul 28, 2003
1,730
554
136
citavia.blog.de
If any significant new research had come along in the CPU architecture space, they would have implemented it. Do you think the guys who designed Haswell left anything on the table? Of course not, so how do you expect Skylake to be much better, given how little CPU architecture science has actually changed in the last 5 years? Sure, they could target different TDPs and core counts, but I'm assuming you're speaking about single-core perf.
If you check the uarch research of the past years, there are a lot of new ideas, improving on all fronts. But these come with significant changes to the uarch (from FU-local changes up to a change of the whole design), new documentation, requirements, new test and verification scripts, etc.: all that is necessary to bring a research idea into a product.

With a complex design with a long evolutionary path (since P6) at hand, this is not that easy. See the MorphCore discussion. Since Merom, Intel hasn't even changed the basic core floor plan much.
 

zentan

Member
Jan 23, 2015
177
5
36
Those criticizing the 5775C for the above-quoted reasons were not thinking clearly. 1) The Iris Pro iGPU could be useful for OpenCL 2.0/Vulkan/DX12-accelerated GPGPU, which is going to be a thing, eventually; 2) the price/availability of Broadwell-C was mostly due to the yields of the 14nm CPU die, NOT the eDRAM die that was fabbed using a high-yield, mature 22nm process; 3) LGA1150 CPUs all have iGPUs anyway, why not get one that has something useful for everyone (L4 eDRAM)?


Right, but it's doing so with a base clockspeed of only 3.3 GHz! It only turbos to 3.7 GHz. It's like . . . why is this CPU even in the same ballpark?

Why doesn't Broadwell-C OC that high? Doesn't eDRAM have some effect on that package? You will certainly see BW-E parts OCing fine.
The i7-6700K's base clock is close to the average high OC of the 5775C, i.e. 4.2 GHz, and the i7-6700K on average does 4.7 GHz or higher; many even reached 5 GHz without delidding. Do you think that would have been possible with the eDRAM on package? So either way people criticize. Anyway, more people would have liked a higher-clocked flagship i7 than an eDRAM-based i7 which can't clock that high. Imagine the outrage if the i7-6700K wouldn't go past ~4.2 GHz easily.
Plus Intel would be releasing an eDRAM-based i7 on a new arch anyway. So what's the issue here?
Wouldn't providing an eDRAM-based Skylake i7 equivalent of Broadwell-C at the same release time as the i7-6700K make the whole Broadwell-C pointless?

And NO, the i7-5775C doesn't always beat or come close to the i7-6700K. Yes, in some cases it does beat or come close to it, in some compression and other workloads. Of course eDRAM helps with gaming latency and makes it look really good in some cases, but the i7-6700K still takes the lead in most cases.
Eventually, say from 10nm onwards, eDRAM might become more common than it is today.

Again, Vulkan/DX12 etc. will take time to materialize on the integrated graphics front, and by then Intel and AMD will likely have better integrated graphics, perhaps with more work on integration. But that's likely some years away from being significant. Let's not forget HSA hasn't seen great momentum for the amount of time it was hyped. Intel's major focus has been making their GPU more than usable and handling more codecs efficiently (Quick Sync etc.), which they were able to do quite well.
 
Last edited:

DrMrLordX

Lifer
Apr 27, 2000
21,637
10,855
136
Why doesn't Broadwell-C OC that high? Doesn't eDRAM have some effect on that package?

Fugger did a good review of the i7-5775C over on XS:

http://www.xtremesystems.org/forums/showthread.php?291735-Broadwell-5775C-on-Phase

Not everyone cares about OC on phase, but if you pick through the thread, you can find lots of data on how well the thing performed with a mere AiO cooler. That was also some time ago . . . support for the chip is certainly better now, and Intel might have some better retail samples in the wild now that they've had a lot of time to work on getting yields of 14nm up a bit.

Anyway, that should tell you what you need to know about Broadwell-C overclocking.

Plus Intel would be releasing an eDRAM-based i7 on a new arch anyway. So what's the issue here?

The issue is that they didn't do it out of the gate.

Wouldn't providing an eDRAM-based Skylake i7 equivalent of Broadwell-C at the same release time as the i7-6700K make the whole Broadwell-C pointless?

Most people regard Broadwell-C as pointless anyway, and given its poor availability, it is mostly a non-factor in the purchasing decisions of buyers.

And NO, the i7-5775C doesn't always beat or come close to the i7-6700K. Yes, in some cases it does beat or come close to it, in some compression and other workloads.

http://techreport.com/review/28751/intel-core-i7-6700k-skylake-processor-reviewed

As you can see from techreport's findings, the 5775C still hangs close to the 6700K in many workloads. The uarch improvements in the 6700K and the higher clockspeeds should have let the 6700K walk all over the 5775C, but it doesn't. A chip that has a 3.3 GHz base clock has no business hanging that close to the 6700K. It's weird.

Again, Vulkan/DX12 etc. will take time to materialize on the integrated graphics front, and by then Intel and AMD will likely have better integrated graphics, perhaps with more work on integration.

The only really "fast" iGPUs Intel has right now are its eDRAM-enhanced parts. So once DX12 shows up and starts throwing compute functions at the iGPU, the parts that should take home the prize are the fastest ones . . . and for Intel, right now, that's the 5775c. Though I must admit, I was a little confused by some of the Luxmark performance on the 5775c.

Let's not forget HSA hasn't seen great momentum for the amount of time it was hyped.

Do not expect DX12 or Vulkan to have that problem.
 

dud

Diamond Member
Feb 18, 2001
7,635
73
91
You're a moron.

If any significant new research had come along in the CPU architecture space, they would have implemented it. Do you think the guys who designed Haswell left anything on the table? Of course not, so how do you expect Skylake to be much better, given how little CPU architecture science has actually changed in the last 5 years? Sure, they could target different TDPs and core counts, but I'm assuming you're speaking about single-core perf.

insulting other members is not allowed
Markfw900



Wow, just wow! I appreciate your "passion" but ... wow!

Have you ever considered taking things a bit less seriously? You (hopefully) have a long life ahead. Don't burn out too quickly ...
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
I don't have much of a dog in any CPU fanboy fight, but IMHO ANY bad results from AMD are bad news for all enthusiasts. Without competition, innovation will be stifled. Just look at the lackluster performance improvement from Haswell to Skylake. If Intel had any credible competition they would have markedly improved their next-generation CPU performance. Consider the possibility that you are being played ...


... or Moore's law is kicking in.


Fanboys of any flavor (either AMD or Intel) make me laugh ...

Couldn't agree more :thumbsup:
 

zentan

Member
Jan 23, 2015
177
5
36
Fugger did a good review of the i7-5775C over on XS:

http://www.xtremesystems.org/forums/showthread.php?291735-Broadwell-5775C-on-Phase

Not everyone cares about OC on phase, but if you pick through the thread, you can find lots of data on how well the thing performed with a mere AiO cooler. That was also some time ago . . . support for the chip is certainly better now, and Intel might have some better retail samples in the wild now that they've had a lot of time to work on getting yields of 14nm up a bit.

Anyway, that should tell you what you need to know about Broadwell-C overclocking.



The issue is that they didn't do it out of the gate.



Most people regard Broadwell-C as pointless anyway, and given its poor availability, it is mostly a non-factor in the purchasing decisions of buyers.



http://techreport.com/review/28751/intel-core-i7-6700k-skylake-processor-reviewed

As you can see from techreport's findings, the 5775C still hangs close to the 6700K in many workloads. The uarch improvements in the 6700K and the higher clockspeeds should have let the 6700K walk all over the 5775C, but it doesn't. A chip that has a 3.3 GHz base clock has no business hanging that close to the 6700K. It's weird.



The only really "fast" iGPUs Intel has right now are its eDRAM-enhanced parts. So once DX12 shows up and starts throwing compute functions at the iGPU, the parts that should take home the prize are the fastest ones . . . and for Intel, right now, that's the 5775c. Though I must admit, I was a little confused by some of the Luxmark performance on the 5775c.



Do not expect DX12 or Vulkan to have that problem.

Yeah, I had seen those tests when they came out. Not much of a crowd for extreme OCing. It doesn't negate that the eDRAM had an effect on OCing as a package.
We only need to compare how much an i7-6700K does on phase-change cooling. There's not enough data to convince me that a Skylake i7 with eDRAM would have done the high OCs that the 6700K does. The average is what should be considered. We don't generally hear of the i7-5775C being 4.8-5 GHz stable for daily usage.

There are enough benches where the i7-6700K pulls ahead of the 5775C, due to the clock difference and a bit higher IPC, but yeah, there will be some where the L4 does the trick. Nothing new in that. For Cinebench, POV-Ray and many others there's a good margin in favor of the 6700K, and not so much in some others. If one looks closely at various benchmarks, particularly single-threaded ones where cache doesn't matter much, you will see an i3-6100 faring as well as the i7-5775C. One needs to look at all sorts of benchmarks.

And again the question arises: why would Intel have the i7-6700K and a Skylake i7 with eDRAM both at the same time? Base clock for Intel generally isn't that much of a thing; their sustained turbos, at least for desktop models under different core loads, are where it matters.
eDRAM with a more powerful GPU, along with the clocks of the i7-6700K, would have placed it in a different TDP bracket than it is in now. Also, it doesn't take a lot to imagine that if, for all of this, they had compromised on the flagship's clocks, it might even have performed below the i7-4790K on many benches where L4 doesn't make any significant difference, and then, you know, it could have been a PR disaster. ;)

I do believe that eDRAM will become more common in the future, maybe with more SKUs having it, but maybe not so much now. eDRAM power consumption is going down with each generation and will go down more in the future. :thumbsup:

You simply can't have all sides of a thing. People will complain anyway. Intel has struggled with 14nm chip availability, but from their POV it really doesn't make any sense to have desktop flagship eDRAM versions of both Broadwell and Skylake launching within 1-2 months of each other.
 
Last edited:

Fjodor2001

Diamond Member
Feb 6, 2010
3,783
254
126
I was aware of the PCWorld article before; there's no quote from Intel there either, so I'm not sure where they got that from.
Yes, I agree it is a bit unclear where they got it from. But multiple reliable sources have reported the same thing, so it seems strange that they would all just make up the same statement.

And several people on this forum thought the statement was totally fair and reasonable anyway, so it should not matter to them exactly what Intel has said when they argue their point of view.
Kirk Skaugen did say this:

Which is true, of course any new processor should be better than its predecessors.
Intel says lots of things. Just because they have said that does not mean they haven't said what was referenced in the Wikipedia article and the other sources too.
At the end of the day I will consider any mention of this as trolling/flaming until someone clarifies who at Intel said it and when. It looks like their statement (if real) has been taken out of context.
As has been shown, there are several reliable sources who have reported it. So until someone proves those sources are incorrect, I will consider anyone denying that statement to be trolling/flaming.
 
Last edited:

Dresdenboy

Golden Member
Jul 28, 2003
1,730
554
136
citavia.blog.de