Intel and AMD say by 2015 we will have 128-core CPUs

Page 2 - AnandTech Forums
Oct 24, 2005
74
0
0
Originally posted by: Viditor
My own predictions FWIW...

1. It'll be far more than 128 cores, more like 1024 cores
2. Clockspeed will increase exponentially (THz?)
3. The size of the chip will decrease drastically (10% of current sizes?)
4. It will be manufactured using carbon nanotubes
5. AMD and Intel will be far behind IBM in the process (IBM is the current leader by far in nanogates)
6. Anand will succeed in his run for the Senate...:)

It's the year 2005! Why can't I teleport?
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Originally posted by: Monkey muppet
Originally posted by: modempower

Originally posted by: govtcheez75
Originally posted by: Viditor
My own predictions FWIW...

1. It'll be far more than 128 cores, more like 1024 cores
2. Clockspeed will increase exponentially (THz?)
3. The size of the chip will decrease drastically (10% of current sizes?)
4. It will be manufactured using carbon nanotubes
5. AMD and Intel will be far behind IBM in the process (IBM is the current leader by far in nanogates)
6. Anand will succeed in his run for the Senate...:)

what about the flying cars?



What about a computer core that TURNS INTO a flying car?!?!?!?!


A flying car, you ask?

Accelerates faster than a Ferrari too.
 

BlingBlingArsch

Golden Member
May 10, 2005
1,249
0
0
Honestly, I don't understand that.
I was just saying that n7 shouldn't get paranoid and start crying whenever someone else is trying to provoke him. Some people in these forums react to every little point of criticism/fun/insult, and that's why so many threads end in boring blah blah...
 

lifeguard1999

Platinum Member
Jul 3, 2000
2,323
1
0
Originally posted by: Markbnj
[The problem is that many tasks are single-threaded tasks. Those tasks that are parallel, eventually run out of work for multiple cores to do. In the supercomputing world, the solution is simple: devise a bigger problem to solve. That solution does not work in the real world for most people on PCs.]

What do you mean by this? The workload is not a batch. It is an interactive event-driven workload. If you follow this logic then you could argue that no automobile needs more than 45 horsepower, because that is enough to get it to freeway speeds. The point is not whether there is always work to do, but how responsive the system is when there is more instantaneous demand than one CPU can handle. This happens constantly under Windows or any other time-slicing operating system. Don't confuse this with percent processor utilization. Even when the processor is utilized at low levels it can still only run one set of instructions at a time. With a single core no two apps can have processor cycles at the same time. With more than one core, they can. That means a more responsive system.

I have well over 100 threads running on my system now. Let's say I have 100 of them. If I have 100 cores, then that system will be as responsive to load as it can possibly be at the rated speed of the processor. One less core will make it less responsive, as two threads will contend for processor cycles at some point. Yes, that difference will be very small in this theoretical case, but the difference between one processor and two is not small at all. On average it means half the amount of thread contention as in a single core system, and half the context switches.

Context switches are very expensive in terms of processor cycles. All the registers have to be swapped, along with any thread-local memory mappings. If the context switch is between two processes, then memory mappings always have to be swapped.

All this arguing that multiple processors don't benefit an ordinary Windows user verges on silly. In general, the average Windows user will have a smoother computing experience with two 1 GHz processors as opposed to one 2 GHz processor. For games this wouldn't hold, but when the speed differential narrows to a few hundred MHz, dual cores catch up fast and provide better response.


The argument you are presenting is called "The Army of Ants vs. the Herd of Elephants". Said another way: is it better to have a larger number of slower processors, or a smaller number of faster ones? An example of the first is the Connection Machine series (CM-2, CM-5), which had 65,000+ 1-bit CPUs. An example of the second approach is the Cray series of supercomputers (T3D, T3E, XT3, etc.). Cray is still around; Connection Machine is not.

That being said, most cars do not have more than 300 HP, and the speeds of even the top cars have leveled off. Why? Because features matter more than HP. My argument is that there will be no need for 128 CPU cores on a chip. Instead, more features will be added to the CPU (specialized circuits for HD decoding, etc.), just like cars now have air bags, cup holders, etc.

Will we go dual-core on the desktop? Yes. Quad-core? Maybe. 128-core? Nope.
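The responsiveness side of this exchange can be put in toy numbers. Under a strict round-robin scheduler, the worst wait a runnable thread sees between two of its quanta depends only on the thread and core counts. A minimal sketch (Python, with made-up thread counts; this is not a model of the real Windows scheduler):

```python
import math

def worst_wait_quanta(n_threads: int, n_cores: int) -> int:
    """Worst-case number of quanta a thread waits between two of its
    turns under strict round-robin: the other n_threads - n_cores
    threads each get a quantum first, spread across n_cores cores."""
    return max(0, math.ceil((n_threads - n_cores) / n_cores))

# With 100 runnable threads, doubling the core count roughly halves
# the wait, and 100 cores eliminate contention entirely.
for cores in (1, 2, 4, 100):
    print(cores, worst_wait_quanta(100, cores))
```

On these toy numbers the wait drops from 99 quanta on one core to 49 on two, which is the "half the contention" intuition in the quoted post; the counter-argument below is about what happens when the per-core speed drops as the count rises.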
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Originally posted by: Mik3y
Originally posted by: modempower

Originally posted by: govtcheez75
Originally posted by: Viditor
My own predictions FWIW...

1. It'll be far more than 128 cores, more like 1024 cores
2. Clockspeed will increase exponentially (THz?)
3. The size of the chip will decrease drastically (10% of current sizes?)
4. It will be manufactured using carbon nanotubes
5. AMD and Intel will be far behind IBM in the process (IBM is the current leader by far in nanogates)
6. Anand will succeed in his run for the Senate...:)

what about the flying cars?



What about a computer core that TURNS INTO a flying car?!?!?!?!

BINGO!

flying cars are ok...FLYING MONKEYS, now that would rock...!!!
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: ExarKun333

flying cars are ok...FLYING MONKEYS, now that would rock...!!!

Yes, but then this wouldn't be Kansas anymore, would it...?
 

Markbnj

Elite Member
Moderator Emeritus
Moderator
Sep 16, 2005
15,682
14
81
www.markbetz.net
[Will we go dual-core on the desktop? Yes. Quad-core? Maybe. 128-core? Nope. ]

I don't disagree with that, except I am not so pessimistic about quad core. I don't think we have 300 horsepower yet.
 

Gatt

Member
Mar 30, 2005
81
0
0
Multi-core is going to be a case of diminishing returns, faster than the MHz bumps were under today's circumstances. There's only so much you can thread out before the cost exceeds the benefits. At some point the processor will be "fast enough" and adding in another core won't be worth the expense. Even games, especially games, have a point where there are more cores than threads. If you could thread out core gameplay, physics, graphics, sound, and AI, you're talking 6 processors total, counting one for the OS. Considering graphics and sound already have dedicated chips, 6 would allow you to do processor-intensive stuff in the background as well.

The only way Multi-core is going to need to move beyond 6-8 cores is if we move to an age where there's a single computer running screens with inputs throughout the house. If a single computer is supplying an entire family's worth of needs, then we could see a need for more than 6-8 cores, but still that would likely top out around 10-20 cores.

While that is a very possible, even probable, scenario, it's still unlikely any user will need more than 6-8 cores at any given time.

I'm guessing Quad-Core will be where it stops though.
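The diminishing-returns intuition in this post is usually formalized as Amdahl's law: if only a fraction of the work can be threaded out, extra cores stop paying off quickly. A sketch with hypothetical numbers (Python; the 80% figure below is invented for illustration):

```python
def amdahl_speedup(parallel_fraction: float, n_cores: int) -> float:
    """Amdahl's law: overall speedup versus one core when only
    parallel_fraction of the work can be spread across n_cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# Suppose a game is 80% parallelizable (a made-up figure):
# 2 cores give ~1.67x, 8 cores ~3.33x, 128 cores ~4.85x,
# and no core count can ever beat 1 / 0.2 = 5x.
for cores in (2, 8, 128):
    print(cores, amdahl_speedup(0.8, cores))
```

Under this model, going from quad to 128 cores buys very little unless nearly all of the workload parallelizes, which matches the "more cores than threads" point above.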
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: Gatt
Multi-core is going to be a case of diminishing returns, faster than the MHz bumps were under today's circumstances. There's only so much you can thread out before the cost exceeds the benefits. At some point the processor will be "fast enough" and adding in another core won't be worth the expense. Even games, especially games, have a point where there are more cores than threads. If you could thread out core gameplay, physics, graphics, sound, and AI, you're talking 6 processors total, counting one for the OS. Considering graphics and sound already have dedicated chips, 6 would allow you to do processor-intensive stuff in the background as well.

The only way Multi-core is going to need to move beyond 6-8 cores is if we move to an age where there's a single computer running screens with inputs throughout the house. If a single computer is supplying an entire family's worth of needs, then we could see a need for more than 6-8 cores, but still that would likely top out around 10-20 cores.

While that is a very possible, even probable, scenario, it's still unlikely any user will need more than 6-8 cores at any given time.

I'm guessing Quad-Core will be where it stops though.

That will depend on the OS...
I have a friend who is a software engineer, and he was telling me of a project he was doing (speculatively) that involved breaking the CPU up into 1000+ virtual cores and running each thread on its own core...
This was way over my head, but he assures me that Windows can be modified to accomplish this fairly easily, and that the results would not only double processing efficiency but enable an almost 100% secure environment.
I'm sorry that I have no details, but if there are any high-level people out there who understand this, please speak up!
 

Leper Messiah

Banned
Dec 13, 2004
7,973
8
0
Something like a virtual machine for each process, Viditor? Would be cool if you wanted to use Linux for web surfing and Windows for games; the extra cores could remove the performance impact from the emulation of different machines...
 

Markbnj

Elite Member
Moderator Emeritus
Moderator
Sep 16, 2005
15,682
14
81
www.markbetz.net
[Something like a virtual machine for each process Viditor?]

We already have that, basically. In fact you could say Windows provides a virtual machine for each thread. That's not quite true but you can almost get there.
 

natto fire

Diamond Member
Jan 4, 2000
7,117
10
76
This thread is as stupid as those old videos from the 50s that said we would be living on the moon now...
 

CheesePoofs

Diamond Member
Dec 5, 2004
3,163
0
0
Originally posted by: 2kfire
[How fast do you really need your characters to show up on the screen when typing?!?]
I want the characters to show up before I type them. So I can think about writing something and then it'll just be written.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: Captain_Howdy
This thread is as stupid as those old videos from the 50s that said we would be living on the moon now...

LOL...true

They forgot to mention how they were going to manage the heat with these things. I suppose Intel could make a nuclear reactor out of their Prescotts.
 

undeclared

Senior member
Oct 24, 2005
498
0
86
I can understand both arguments people are giving here...

This is my take:

We will increase cores for a while. We might even get as far as 8, 16, or even 32 cores in one.

At this point, all the cores will probably be, for Intel, 3.4 Extreme cores, and for AMD, Athlon 64 FX-57 cores...

We will want new features. I think we might change the way the cores work at this point..

For example:
core 1: video related
core 2: compression related
core 3: sound related
core 4-8: etc..

I think we'll be able to have computers that work very specialized and very strong in all areas... then it will be about 2x video, 2x compression, etc.. based on our needs.

That's what I think will happen.
 

Agnostos Insania

Golden Member
Oct 29, 2005
1,207
0
0
In 20 years, I predict computers will be twice as powerful, one-hundred times as large, and only the five richest kings of Europe will be able to afford them.
 

MNOB07

Member
Aug 23, 2005
43
0
0
in ten years:

ultraviolet disks - UVD or UV-ray
solid-state drives replace current-day hard drives
arguments arise about how many pixels the eye can detect
 

Vee

Senior member
Jun 18, 2004
689
0
0
Originally posted by: Viditor
I have a friend who is a software engineer, and he was telling me of a project he was doing (speculatively) that involved breaking the CPU up into 1000+ virtual cores and running each thread on its own core...
This was way over my head, but he assures me that Windows can be modified to accomplish this fairly easily, and that the results would not only double processing efficiency but enable an almost 100% secure environment.

The immediate reaction is that it's rubbish.
As Markbnj stated, Windows, like any OS capable of pre-emptive multitasking, already does all that.
It's not called 'virtual cores' though. It's called scheduling and context switches.

Also, there's really not much point in running each thread on its own core. Why would one want to do that? It kinda hints at some fundamental misunderstandings about what threads are and about how a processor works.
There is also the question about whether he really said what you understood him to say. And to what degree he put the things in inadequate terms in order to paint a picture for you.

But let's give it the benefit of the doubt and assume you understood him correctly, and that your friend works for Microsoft and has access to the Windows source code. If he doesn't, you kinda have my answer already.

What does 'virtual cores' mean? Essentially, it should be to have a Windows version that supports 1000 CPUs (what would that be, BTW?), and to fool the scheduler into releasing 1000 threads simultaneously to something other than the processor/processors. That "something else" is essentially a scheduler in itself too. And that "something else" would also need to run on a thread.
Now what's the point in doing that?
What he implies is that he has an idea for a better scheduler than Windows' current one.
So why doesn't he say that instead?
Why not just replace the Windows scheduler?

"double processing efficiency"? Where is this processing inefficiency that he intends to take a bite of?
Well, if you run 1000 threads, you might want to reduce quanta length, and then you might want to do something about the context shifts.
If your friend's claims are valid, I would guess he is intending to somehow reduce the amount of housekeeping in some 'replacement context shift'. That's it.

But that isn't something that is easy to achieve.
You can do it with a completely different architecture. One that separates code from data and doesn't allow non-reentrant code. But that's not Windows, ...or any *nix.
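Vee's point about quantum length and context-switch housekeeping can be put in rough numbers: with 1000 threads you would shrink the quantum to keep latency down, and the fixed switch cost then eats a growing share of the CPU. A toy model (Python, with invented costs; real switch costs vary widely by OS and hardware):

```python
def useful_fraction(quantum_us: float, switch_cost_us: float) -> float:
    """Fraction of CPU time spent on real work if every quantum
    ends in a context switch costing switch_cost_us microseconds."""
    return quantum_us / (quantum_us + switch_cost_us)

# Invented numbers: a 5 microsecond context-switch cost.
# A 10 ms quantum wastes almost nothing; a 100 us quantum
# (to keep 1000 threads responsive) wastes about 5% of the CPU.
print(useful_fraction(10_000, 5))
print(useful_fraction(100, 5))
```

This is the trade-off a 'replacement context switch' would have to attack: either quanta stay long and latency suffers, or quanta shrink and the housekeeping overhead grows.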
 

Vee

Senior member
Jun 18, 2004
689
0
0
I'd really like to make a couple of contributions to this thread.

Around 1990, I participated in a debate. Before going, I reinforced myself with what process engineers from both Motorola and Intel said. They said about the same thing: sure, we'll be at 500 MHz and over 100 million transistors within some years.
At the time that would have seemed outrageous to most people, as we were then at 33-50 MHz and had just passed one million transistors.

Now, I haven't seen this "128-cores" thing. (Does anyone have any link to that?)
I'm inclined to flat out disbelieve 128 cores by 2015. 16 to 32 cores around 2015 seems more right. Then we'd have 128 cores maybe by 2025.

But this "128 cores" prediction stirs that memory.
They weren't completely right, as 500MHz did not coincide with 100 million transistors.
MHz marched faster. But that's a minor thing. Today we're at 2.4 GHz and 210 million transistors with dual-core A64s, and 3.8 GHz and 170 million with the 600-series P4.
What they really said that time was that they didn't see any major obstacle on the road ahead.

We might not get 128 cores. But we'll get something, that you will find hard to imagine now.

*********

"No use for" some says.
Nothing could be more wrong.
The thing is that what we use computers for, is decided by what they can do!
That is how it has always been, and the "no use for" argument dates back all the way to the stoneage. It's never been right sofar.

Just imagine this for a teaser:
A scene, or a sports arena, is captured by a number of advanced, synchronized, hi-res cameras and microphones, from different positions and angles. The pictures are analyzed and complex 3D-scenery data is computed.

The 3D-scenery is then fed into your PC, which is able to do real time rendering of the scene. You can move around in the scene, just like in a FPS-game, watching and hearing it from any angle or distance. It will be like you yourself controlled your own camera.
The PC is not playing a recording, it is computing/creating and rendering a scenery in real time.

There is already the genesis for this technology, used for some types of sports coverage.

But I think that on a consumer level we will first see this in the successor to the movie: a 'walk about in' 3D experience rendered on the fly.

...And that's only one thing.

The only reason why this (and other things) wouldn't happen, would be if the market competition for some reason fails to drive this development.

For instance, if AMD goes down. Intel would then sharply reduce progress on mainstream x86 processors. They would do that in order to create a greater differential between high and low level. This would be accompanied with an even greater differential in costs.
That would allow them to squeeze more money from enterprise, business, server and workstation sectors. I think IBM and Power would be fine with that and go along nicely enough.

PC publications would feature articles like "What happened to Moore's law?", where they, no doubt, would conclude that "PC performance has risen to a level where no additional performance is really needed", coupled with "having finally reached substantial technical obstacles". Thus the market "now drives other features than raw performance".

Those conclusions would seem valid and reasonable to most people, but would nevertheless be totally wrong.

P.S. There is another complication: Xbox and PlayStation. As long as there are two players, we will see continued progress. And at some point, internet-connected game consoles and HDTV will challenge the PC, unless PC hardware, software, and OS keep moving. Which is what I'd expect, but if AMD dies...
And if MS also tires, or Sony folds, computing will become pretty stuck.

 

kamper

Diamond Member
Mar 18, 2003
5,513
0
0
Originally posted by: Markbnj
[The problem is that many tasks are single-threaded tasks. Those tasks that are parallel, eventually run out of work for multiple cores to do. In the supercomputing world, the solution is simple: devise a bigger problem to solve. That solution does not work in the real world for most people on PCs.]

What do you mean by this? The workload is not a batch. It is an interactive event-driven workload. If you follow this logic then you could argue that no automobile needs more than 45 horsepower, because that is enough to get it to freeway speeds. The point is not whether there is always work to do, but how responsive the system is when there is more instantaneous demand than one CPU can handle. This happens constantly under Windows or any other time-slicing operating system. Don't confuse this with percent processor utilization. Even when the processor is utilized at low levels it can still only run one set of instructions at a time. With a single core no two apps can have processor cycles at the same time. With more than one core, they can. That means a more responsive system.

I have well over 100 threads running on my system now. Let's say I have 100 of them. If I have 100 cores, then that system will be as responsive to load as it can possibly be at the rated speed of the processor. One less core will make it less responsive, as two threads will contend for processor cycles at some point. Yes, that difference will be very small in this theoretical case, but the difference between one processor and two is not small at all. On average it means half the amount of thread contention as in a single core system, and half the context switches.

Context switches are very expensive in terms of processor cycles. All the registers have to be swapped, along with any thread-local memory mappings. If the context switch is between two processes, then memory mappings always have to be swapped.

All this arguing that multiple processors don't benefit an ordinary Windows user verges on silly. In general, the average Windows user will have a smoother computing experience with two 1 GHz processors as opposed to one 2 GHz processor. For games this wouldn't hold, but when the speed differential narrows to a few hundred MHz, dual cores catch up fast and provide better response.
This is incorrect. The whole point of timeslicing is that it turns an interactive environment into a batch of jobs. On a system not under heavy load, context switching isn't that big a deal but scheduling on 100 cores using today's algorithms would be. Granted, once you get that many cores your scheduling algorithm will change and would involve more core-pinning but we're not there right now.

The big difference is when you have one thread that wants to do a lot of work, or at least a lot more than all the rest. If you were to take your current single core and split it into 100 cores, each with 1/100th the power of the original, you could only work as fast as the single core if no single thread ever needed more than 1/100th of the total available power. And some thread needing more than that is pretty much guaranteed to happen.

In general, the difficulty of multi-core scheduling for a large number of threads is going to outweigh the overhead of context switches, so fewer, more powerful cores are better, especially when a single thread needs full power. Multi-core is also nasty when you have two threads touching the same data, because you have to synchronize between two caches. Single core is safe here because you never have to worry about multiple copies of a variable. I imagine multi-core is only seriously beneficial when you have about the same number of cores as threads doing heavy processing, so you can avoid context switches where possible. Otherwise you've got to worry about inter-core context switches as well, which means flushing a whole cache instead of just registers. Going with too many cores limits the amount of processing some threads can do while wasting power on others, unless all your cores run at the maximum possible power of any single core, but that's not a fair comparison :p
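This trade-off can be checked with a small model: the best possible finish time for a set of independent threads on identical cores is bounded below both by the single longest thread and by the total work divided across the cores. A sketch (Python, with hypothetical workloads; scheduling overhead and cache effects are ignored):

```python
def best_finish_time(work_units, n_cores: int, core_speed: float) -> float:
    """Lower bound on finish time (makespan) for independent threads:
    limited by the single longest thread and by the total work
    spread over all n_cores cores, divided by per-core speed."""
    return max(max(work_units), sum(work_units) / n_cores) / core_speed

# One fast core (speed 1) vs 100 slow cores (speed 1/100 each),
# both with 100 total units of work:
balanced = [1] * 100        # 100 equal threads: the two designs tie
heavy = [50] + [1] * 50     # one heavy thread: slow cores lose badly
print(best_finish_time(balanced, 1, 1.0))     # fast single core
print(best_finish_time(balanced, 100, 0.01))  # many slow cores: same
print(best_finish_time(heavy, 1, 1.0))        # fast single core
print(best_finish_time(heavy, 100, 0.01))     # ~50x worse: one thread
                                              # is stuck on a 1/100 core
```

The many-slow-core design only matches the fast core when no thread needs more than its 1/100th share, which is exactly the condition argued above to be unrealistic.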
 

Madcadden

Junior Member
May 17, 2015
1
0
66
Well, it's now 2015 and there are still no 128-core CPUs from AMD or Intel.

I'm sitting here laughing as I read back over some of the posts made here!
 