AMD to Cut 5% of Workforce


ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
It doesn't matter what is impressive or not. What matters is what is possible and what the market wants.

It's quite clear you have no experience at all in any of this, but just go on with the regular BS.

I assume that because your posts are nonsensical and show a distinct lack of experience.

Yep, hot air and nonsense like always with him. But it's hard to expect otherwise, given that agenda of his.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,195
580
126
It doesn't matter what is impressive or not. What matters is what is possible and what the market wants.

Matters in what context?

And do you consider the desktop ST performance improvements of the last few years impressive or not? You wrote:

"Why do you think ST performance haven't moved much? Even with billions thrown at it."

Is there any other way to interpret that than that you consider it impressive? Or perhaps it's just your usual economics-based view: "Intel has thrown $X billion at R&D, so the improvements have to be impressive regardless of the actual outcome!"
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Matters in what context?

And do you consider the desktop ST performance improvements of the last few years impressive or not? You wrote:

"Why do you think ST performance haven't moved much? Even with billions thrown at it."

Is there any other way to interpret that than that you consider it impressive? Or perhaps it's just your usual economics-based view: "Intel has thrown $X billion at R&D, so the improvements have to be impressive regardless of the actual outcome!"

The issue seems to be your complete lack of realism.

[Image: Intel Xeon IPC chart (intel-xeon-ipc-chart.jpg)]


If increasing IPC were so easy, AMD wouldn't be where they are today.
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
There is a famous article within the software industry from Dr Dobbs, from 2005.

It's famous because it correctly predicted the decline of single-core IPC increases and the rising need to develop multi-threaded / multi-process software to continue increasing performance.

This should be pretty well known to any experienced software developer...

The Free Lunch Is Over
A Fundamental Turn Toward Concurrency in Software

http://www.gotw.ca/publications/concurrency-ddj.htm
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
The issue seems to be your complete lack of realism.

[Image: Intel Xeon IPC chart (intel-xeon-ipc-chart.jpg)]


If increasing IPC were so easy, AMD wouldn't be where they are today.

You would think somebody with an MS in Computer Science would know this.

Do the online paper mills offer CS degrees?
 

jpiniero

Lifer
Oct 1, 2010
16,780
7,233
136
Actually, I have a suspicion that Intel nerfed Skylake mostly to get it down to something that could sorta be sold as a tablet processor, but also because the power increase wasn't worth it.
 
Aug 11, 2008
10,451
642
126
You would think somebody with an MS in Computer Science would know this.

Do the online paper mills offer CS degrees?

Well, that chart only tells part of the story. What it doesn't show is that up until Sandy Bridge, IPC increases came on top of steadily increasing clock speeds, especially with overclocking. Since Sandy Bridge, maximum clock speed has actually regressed, partially eating up the IPC increases. This means actual per-core performance (clock speed × IPC) is going up at a much slower rate than before.
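To put rough numbers on that multiplication (the IPC and clock figures below are made-up placeholders for illustration, not measurements):

```python
# Per-core performance scales roughly as IPC * clock frequency.
# The IPC gains and clock speeds below are illustrative placeholders,
# NOT measured values.

generations = {
    # name: (relative IPC vs. Sandy Bridge, achievable clock in GHz)
    "Sandy Bridge (OC)": (1.00, 4.8),
    "Skylake (OC)":      (1.20, 4.6),  # assume ~20% IPC gain, lower max clock
}

base_ipc, base_clock = generations["Sandy Bridge (OC)"]
for name, (ipc, clock) in generations.items():
    relative = (ipc * clock) / (base_ipc * base_clock)
    print(f"{name}: {relative:.2f}x per-core performance")
# A 20% IPC gain minus a clock regression nets only ~15% here.
```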

Of course, if one strategy (increasing clock speed and IPC) is no longer working, there is an obvious alternative strategy, which is to increase core count. Too bad Intel is too stubborn and too concerned about the IGP and power usage to do the obvious.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,583
10,224
126
Of course, if one strategy (increasing clock speed and IPC) is no longer working, there is an obvious alternative strategy, which is to increase core count. Too bad Intel is too stubborn and too concerned about the IGP and power usage to do the obvious.
Whatever happened to the "only the paranoid survive" Andy Grove-esque Intel that I used to know? If they were still around, we would be on 10-core mainstream CPUs by now. (Edit: And Intel wouldn't be losing $4B to contra-revenue either.)
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Yes, please give us less performance in smaller form factor. The tech companies should all vigorously strive to reduce performance and productivity. That should be the number one goal that they devote R&D to. Good times ahead.

And yet you're one of the FEW people complaining. Everyone else is very excited when these devices come out.

So it shows you're in a minority Intel has little reason to cater to. What a surprise....
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Whatever happened to the "only the paranoid survive" Andy Grove-esque Intel that I used to know? If they were still around, we would be on 10-core mainstream CPUs by now. (Edit: And Intel wouldn't be losing $4B to contra-revenue either.)

Intel has 98% of the x86 market. There's no growth there; the growth is in low-power devices. I don't get why people can't seem to comprehend this.
 
Mar 10, 2006
11,715
2,012
126
Whatever happened to the "only the paranoid survive" Andy Grove-esque Intel that I used to know? If they were still around, we would be on 10-core mainstream CPUs by now. (Edit: And Intel wouldn't be losing $4B to contra-revenue either.)

What do mainstream core counts have to do with "paranoia"?

And Intel isn't losing $4B in contra-revenue.
 

Spungo

Diamond Member
Jul 22, 2012
3,217
2
81
Chicken and egg. Why develop SW requiring more performance when that performance is not available anyway, since desktop performance improvement is so crappy? Nobody would use that SW anyway, since their PCs would be too slow for it.

There's definitely a problem of diminishing returns. I still remember the first time I saw Quake. I couldn't believe how awesome it looked, and one needed a Pentium to run that game. I also remember the first time I saw GLQuake. A rocket would illuminate bits of the hallway and cast shadows on some things. It was the most amazing thing I had ever seen. When an improvement like that comes along, people are willing to pay a lot of money for it. Modern graphics are so good that nothing really seems that impressive anymore. I bought Call of Duty 4 a few weeks ago, a game from 2007, and I think it looks great eight years on. I don't think I will ever be as impressed as I was when Quake came out. Nothing will force me to drop everything and immediately buy the most expensive processor on the market. I'm still using a Phenom II from 2010, and nothing has come along to justify buying a faster CPU.

It seems like we've hit the limit on speed if we can get away with using 5-year-old computers to play the latest games, so the current focus is on power consumption and size. People won't buy a computer just because it's faster, but they might buy one if it's smaller, quieter, and generates less heat. It's possible to buy an i7 that fits in a mini-ITX case. That's not my cup of tea, but some people really like that idea.
 
Aug 11, 2008
10,451
642
126
And yet you're one of the FEW people complaining. Everyone else is very excited when these devices come out.

So it shows you're in a minority Intel has little reason to cater to. What a surprise....

Yes, I can hardly contain my enthusiasm for buying a $1,500 convertible with the performance of my Core 2 Duo laptop from 2010.
 

superstition

Platinum Member
Feb 2, 2008
2,219
221
101
There is a famous article within the software industry from Dr Dobbs, from 2005.

I like this part:

Myths and Realities: 2 x 3GHz < 6 GHz

So a dual-core CPU that combines two 3GHz cores practically offers 6GHz of processing power. Right?

Wrong. Even having two threads running on two physical processors doesn’t mean getting two times the performance. Similarly, most multi-threaded applications won’t run twice as fast on a dual-core box. They should run faster than on a single-core CPU; the performance gain just isn’t linear, that’s all.

Why not? First, there is coordination overhead between the cores to ensure cache coherency (a consistent view of cache, and of main memory) and to perform other handshaking. Today, a two- or four-processor machine isn’t really two or four times as fast as a single CPU even for multi-threaded applications. The problem remains essentially the same even when the CPUs in question sit on the same die.

Second, unless the two cores are running different processes, or different threads of a single process that are well-written to run independently and almost never wait for each other, they won’t be well utilized. (Despite this, I will speculate that today’s single-threaded applications as actually used in the field could actually see a performance boost for most users by going to a dual-core chip, not because the extra core is actually doing anything useful, but because it is running the adware and spyware that infest many users’ systems and are otherwise slowing down the single CPU that user has today. I leave it up to you to decide whether adding a CPU to run your spyware is the best solution to that problem.)

If you’re running a single-threaded application, then the application can only make use of one core. There should be some speedup as the operating system and the application can run on separate cores, but typically the OS isn’t going to be maxing out the CPU anyway so one of the cores will be mostly idle. (Again, the spyware can share the OS’s core most of the time.)
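Sutter's "2 x 3GHz < 6GHz" point is essentially Amdahl's law. A minimal sketch of the math (the 10% serial fraction is an arbitrary assumption for illustration):

```python
# Amdahl's law: speedup on n cores when a fraction s of the work is serial.
# s = 0 would give perfect linear scaling; any s > 0 caps the speedup.

def amdahl_speedup(n_cores: int, serial_fraction: float) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

# Illustrative assumption: 10% of the program cannot be parallelized.
for n in (1, 2, 4, 8):
    print(f"{n} cores: {amdahl_speedup(n, 0.10):.2f}x speedup")
# 2 cores -> ~1.82x, not 2x: two 3 GHz cores != one 6 GHz core.
```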
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,195
580
126
The issue seems to be your complete lack of realism.

[Image: Intel Xeon IPC chart (intel-xeon-ipc-chart.jpg)]


If increasing IPC were so easy, AMD wouldn't be where they are today.

Perf is not decided by IPC alone, but by frequency too (and by core count for MT performance).

So the problem is that you're missing a similar graph of frequency increases. Show us that, and how the last few years compare to the frequency increases over the previous 30 years.

Then you'll see how crappy the recent perf increase has been by comparison.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,195
580
126
You would think somebody with a MS in Computer Science would know this.

Do the online paper mills offer CS degrees?

It's missing the frequency part of the perf increases, so conclusions about total perf increase cannot be drawn from that chart alone. That's obvious to anyone with even a little knowledge in the area. I can tell that you certainly don't have any.

Oh, and BTW, you never responded to this:
What relevant merits do you have yourself?
Neither did ShintaiDK respond to this:
"major enterprise class experience"? What does that mean? What SW engineering have you studied, and have you actually designed and written any SW?
Seems like both of you are very busy requesting that others list their merits and background, yet fail to provide any such info about yourselves. I think that says a lot about your credibility.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,195
580
126
And yet you're one of the FEW people complaining. Everyone else is very excited when these devices come out.

Really? o_O Some may be excited about some devices due to better screens and such, but not about the CPU itself, which is what I was talking about.

I don't see many average people cheering with enthusiasm about Skylake. Not even enthusiasts. A 5% yearly performance increase simply doesn't do it.

I think it's only the Intel marketing people (and their friends on this forum) who actually believe it when they say that Skylake is "the most important chip architecture in a decade". To everyone else that is a completely ridiculous statement.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
There is a famous article within the software industry from Dr Dobbs, from 2005.

It's famous because it correctly predicted the decline of single-core IPC increases and the rising need to develop multi-threaded / multi-process software to continue increasing performance.

This should be pretty well known to any experienced software developer...

The Free Lunch Is Over
A Fundamental Turn Toward Concurrency in Software

http://www.gotw.ca/publications/concurrency-ddj.htm

Exactly, he nailed it. Software has been the issue ever since.

But that won't stop foolish people from thinking that "moar cores" solves everything.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,195
580
126
Exactly, he nailed it. Software has been the issue ever since.

But that won't stop foolish people from thinking that "moar cores" solves everything.

I don't know why you're trying to turn this into an ST vs MT debate all of a sudden. The original complaint was that performance improvements in general have been very slow on desktops lately, not MT performance and core count specifically.

And who has said that more cores solve "everything"? Of course SW has to be multi-threaded to make use of them; that's obvious. No need to throw a Dr. Dobb's article from 2005 at us to prove it. Note that it was written when CPUs with more than one core were very uncommon (the Athlon X2 hit the market in May 2005), and so was multi-threaded SW, for obvious reasons. Back then there was a need to communicate the implications of multi-core CPUs, which were just hitting the mainstream market.

But ten years on, more and more SW is becoming multi-threaded. On mobile especially, that's already the case:

What we see in the use-case analysis is that the number of use-cases where an application is visibly limited by single-threaded performance seems to be very limited. In fact, in a large number of the analyzed scenarios our test device with Cortex A57 cores would rarely need to ramp up to its full frequency beyond short bursts (thermal throttling was not a factor in any of the tests). On the other hand, scenarios where we'd find 3-4 high-load threads seem not to be particularly hard to find, and actually appear to be a pretty common occurrence. For mobile, the choice seems to be obvious due to the power curve implications. Except in scenarios where loads are so small that it is not worthwhile to spend the energy to bring a secondary core out of its idle state, one could generalize that if one is able to spread the load over multiple CPUs, it will always be preferable and more efficient to do so.
In addition, the more cores that are available in mainstream CPUs, the more benefit there is in making SW multi-threaded. Hence chicken and egg again.
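And spreading load over the available cores is routine in modern code. A minimal sketch (the sum-of-squares workload is just a stand-in for any CPU-bound task, not anything from the quoted article):

```python
# Split a CPU-bound job across all available cores.
# The workload itself is a placeholder for illustration.
import os
from concurrent.futures import ProcessPoolExecutor

def work(chunk: range) -> int:
    # Stand-in task: sum of squares over this chunk.
    return sum(i * i for i in chunk)

if __name__ == "__main__":
    n_cores = os.cpu_count() or 1
    # Stride the input range so each core gets an equal share.
    chunks = [range(i, 10_000_000, n_cores) for i in range(n_cores)]
    with ProcessPoolExecutor(max_workers=n_cores) as pool:
        total = sum(pool.map(work, chunks))
    print(f"total={total}, computed on {n_cores} cores")
```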

Finally, just wondering: why are you using a 4C/8T CPU yourself? You do know there are 2C/2T CPUs that would suit you better (sorry, no single-core mainstream Intel CPUs available anymore). Just get an Intel Pentium 2C/2T and OC it to a higher frequency. That would be more consistent with your convictions.
 