What CPUs will we be using in 10 years?


taltamir

Lifer
Mar 21, 2004
13,576
6
76
and that the realm of CPUs will stay in raw compute power.
CPUs don't have raw computing power; GPUs do. CPUs have very little computing power in comparison, but they can perform a single linear task very fast (branch prediction, out-of-order execution, high clock speeds, etc.).
NetBurst did not increase the speed of number crunching; it slowed it down. It increased the frequency, aka MHz, aka how often the chip performs "an operation".
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
CPUs don't have raw computing power; GPUs do. CPUs have very little computing power in comparison, but they can perform a single linear task very fast (branch prediction, out-of-order execution, high clock speeds, etc.).

That actually depends. In terms of the things GPUs are good at, CPUs really do lack computing power. But if you look at single precision they get closer, and if you look at double-precision FP power, they become very close. Double precision is the most important FP format for PC apps, and is also used a lot in workstations.

Nvidia GeForce GTX 280
- 933 GFLOPS single precision
- 117 GFLOPS double precision

Core 2 Quad 3.0GHz
- 96 GFLOPS SP
- 48 GFLOPS DP

Core 2 Duo 3.0GHz
- 3.0GHz x 8 SP FLOPs/cycle x 2 cores = 48 GFLOPS single precision
- 24 GFLOPS double precision

Theoretical Sandy Bridge at 3.4GHz x 8 cores
- 190.4 GFLOPS DP
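
All of these figures fall out of the same back-of-the-envelope formula: peak GFLOPS = clock (GHz) x FLOPs per cycle per core x core count. A minimal sketch of that arithmetic (the 8 SP / 4 DP FLOPs-per-cycle values are the standard SSE figures for Core 2; the GPU and Sandy Bridge entries rest on further assumptions about their SIMD units):

```python
# Back-of-the-envelope peak-FLOPS arithmetic behind the figures above.
# Theoretical peak only: real code is usually limited by memory, not ALUs.

def peak_gflops(clock_ghz, flops_per_cycle, cores):
    return clock_ghz * flops_per_cycle * cores

# Core 2 with SSE: 4-wide SP add + 4-wide SP mul = 8 SP FLOPs/cycle (4 for DP)
print(peak_gflops(3.0, 8, 4))  # Core 2 Quad, SP -> 96.0
print(peak_gflops(3.0, 4, 4))  # Core 2 Quad, DP -> 48.0
print(peak_gflops(3.0, 8, 2))  # Core 2 Duo, SP  -> 48.0
```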


We know that FP performance doesn't determine everything in a CPU. Unlike GPUs, CPUs aren't limited by compute power on the vast majority of the applications they run. They are mostly limited by other things, memory parallelism among them.

The way I think of it, GPUs are optimized for running "dumb" instructions fast. CPUs are optimized for running "smart" instructions. :)
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Yes, which is why I would say the GPU is the one with more "raw power"... it can do a lot of dumb instructions very fast. CPUs, on the other hand, have refined, focused, directed power... they can do amazing things too.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Dedicated hardware for addressing specific application needs will always result in superior "raw performance" versus using an all-purpose processor for the same task.

But raw performance says nothing about performance/dollar or performance/watt.

The SRAM cache on my Kentsfield is superior to the DRAM chips sitting next to it on my mobo, but I'm not about to buy 8GB of 4GHz SRAM from Intel were they to start selling dedicated SRAM chips to replace RAM.

What is special about today's GPUs is that, while they are dedicated hardware with limited (albeit expanding) applications, they aren't all that great at doing computations on anything but matrices of data. Not an all-purpose processor by any means.

If you want processors that are good at matrix mathematics, then you should be looking at Niagara 2 or POWER6 or Itanium chips. The reason such comparisons never get made, though, is that most folks who would go to the trouble of making them already realize the price/performance argument kills the point of discussion, so why start it?

If you want performance, you go big iron. If you want price/performance, go to the consumer desktop market and look at x86 quads and GTX 280s. You'll do POV-Ray benching real good, then after those 5 minutes of benching are done you'll look around your room and say "ok, now what do I do with all this teh uber hardware? mmmm, I guess I'll download F@H and be done with it".
 

Pederv

Golden Member
May 13, 2000
1,903
0
0
What processors will we be using in 10 years?

Since Intel will be the only company making computational logic chips... an Atom III @ 3GHz.

Everything else will be out of our price range.
 

nyker96

Diamond Member
Apr 19, 2005
5,630
2
81
In one of my dreams it came to me: I envisioned that in 10 years I will be building a flux-time computer that can perform any computation yet uses zero time. It accomplishes this amazing feat by running all operations in parallel timelines, therefore creating the ultimate multitasking system (running Windows Temporal Edition, of course). It will run on a compacted fusion reaction chamber powered by a bottle of Coca soft drink, all for the bargain low price of 3000 yen (well, most people will have abandoned the dollar by then due to the downslide of the American economy). And by executive decree from President Palin, all fusion chambers are required to be sold with a thermonuclear detonation kit that can instantly convert any average household computer into a deadly weapon in case of Russian attacks. As President Palin proudly proclaims: "Who needs a shotgun when such a potent weapon is made available to the average citizens of this country? And the enemies of democracy will be kept at bay for millennia to come."
 

DanDaMan315

Golden Member
Oct 25, 2004
1,366
0
0
Single "processor" 216 core designs similar to what we are all seeing in the Graphics card industry. This seems to stay a lot cooler and the processor it self needs a program distributor integrated to distribute the tasks on the operating system.

Multi-layer pin designs too, the motherboard is running out of room for the processor. I see more and more things being done directly on the processor and computers getting smaller. This means for less motherboard space and therefor less room for the CPU. Hopefully the industry can put out CPU's that do not need active cooling at stock so we can all overclock joyfully again.
 

LS8

Golden Member
Jul 24, 2008
1,285
0
0
I predict the business-class desktop PC will be extinct within 5 years. We are coming full circle with terminals, or as people like to call them today, thin clients. It's cheaper and easier to implement a monster terminal server and let everyone connect via terminals/thin clients. You still get a Windows or Linux environment, whichever you prefer.

I honestly don't see the home and mobile PC markets advancing much in CPU power. If you consider that the average user only browses the web, watches movies, and checks email, all of these tasks can be done on much slower hardware than even today's high-end. I doubt you will see quad-core become commonplace in the home outside of the performance consumer market (like folks on AT who build their own PCs).

The mobile market is going to keep heading towards the low-power, basic-function ultra-portables we have seen come out in the last few years. The home market is going to head into what I call the "decade machine", where you buy a PC that is designed to serve the user for up to ten years before being replaced.
 

Niku

Member
Aug 31, 2008
151
0
0
The human brain is the best computer ever conceived. Eventually, they will start harvesting human brains for use as CPUs, suspended in a protein slurry. GPUs will be segments of people's lobes and sensory cortices! It will be like The Matrix in reverse. Code will be suspended in a collective human brain cluster/consciousness. Eventually, we will start interfacing with the network like in The Matrix until we just start permanently integrating our beings into the mass. And someday, the world will just be covered in one huge super-brain.

Good Times!
 

DanDaMan315

Golden Member
Oct 25, 2004
1,366
0
0
Originally posted by: LS8
I predict the business-class desktop PC will be extinct within 5 years. We are coming full circle with terminals, or as people like to call them today, thin clients. It's cheaper and easier to implement a monster terminal server and let everyone connect via terminals/thin clients. You still get a Windows or Linux environment, whichever you prefer.

I honestly don't see the home and mobile PC markets advancing much in CPU power. If you consider that the average user only browses the web, watches movies, and checks email, all of these tasks can be done on much slower hardware than even today's high-end. I doubt you will see quad-core become commonplace in the home outside of the performance consumer market (like folks on AT who build their own PCs).

The mobile market is going to keep heading towards the low-power, basic-function ultra-portables we have seen come out in the last few years. The home market is going to head into what I call the "decade machine", where you buy a PC that is designed to serve the user for up to ten years before being replaced.

Business-class computers will probably stay around. They should be built a little more soundly: longer-lasting hard drives and power supplies to keep the things running. Most business/office computers don't need to be super fast, as businesses are cheap, cheap, cheap.

***Multi-core chips are supposedly lowering power consumption? Are business computers these days actually more efficient than a home PC? If you check out HP.com, there doesn't seem to be much of a difference except maybe a small discount, which just comes out of the cost of components, unless there is some sort of bulk discount. Most business computers are being used for one task right now (and hopefully forever), so multi-core is somewhat useless at this point? Corporations shouldn't have workers that have Excel open with default "I'm working" spreadsheets up while they are on eBay, Amazon, and Super Extreme Java Games.us.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: keysplayr2003
Originally posted by: myocardia
Originally posted by: nerp
So how fast will that load Photoshop? :)

Even Photoshop CS4, which was just released a day or two ago, doesn't benefit from quads. Maybe by then a quad will be faster than a dual-core, in CS18. I'm not holding my breath, though. :D

"NVIDIA QUADRO CX - THE ACCELERATOR FOR ADOBE CREATIVE SUITE 4"

No CPU on the planet can compete with this right now for PS CS4.

I mention this because I just saw the demo for it. And you reminded me of it.

By the way, isn't Larrabee supposed to have 32 cores? Due out next year or in 2010?


Keys, start another thread on this. Let's really get into it, if you could. Tell us everything about the codec for this amazing accomplishment. Don't hold back, because I won't.

 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: Nemesis 1
Originally posted by: keysplayr2003
Originally posted by: myocardia
Originally posted by: nerp
So how fast will that load Photoshop? :)

Even Photoshop CS4, which was just released a day or two ago, doesn't benefit from quads. Maybe by then a quad will be faster than a dual-core, in CS18. I'm not holding my breath, though. :D

"NVIDIA QUADRO CX - THE ACCELERATOR FOR ADOBE CREATIVE SUITE 4"

No CPU on the planet can compete with this right now for PS CS4.

I mention this because I just saw the demo for it. And you reminded me of it.

By the way, isn't Larrabee supposed to have 32 cores? Due out next year or in 2010?


Keys, start another thread on this. Let's really get into it, if you could. Tell us everything about the codec for this amazing accomplishment. Don't hold back, because I won't.

I don't know why but it pleasures me immensely to see Nvidia get the hurt put on it. :evil:

Nvidia never bothered me none, and I always liked their products (I use ATI and Nvidia and Intel across all my systems), but for some reason I just can't stand the publicly displayed personal side of their CEO.

I guess that makes me biased against Nvidia (as opposed to being biased in favor of a specific GPU maker), so I am an "Nvidia un-fanboi"?

The same can be said of my feelings for Hector Ruinz, but oddly enough I've never been pleasured by seeing AMD get the hurt put on them (as I suspect my dear friend Nemesis here does get pleasure from it ;)).

But I love my dual-screen nView setup now just as much as I did 10 years ago on my Ti3000 (or whatever dual-port VGA card Nvidia was sporting at the time).

I guess it's just a case of me liking to dislike the top dog :D

(edit: fixed some truly embarrassing spelling errors :eek:)
 

lyssword

Diamond Member
Dec 15, 2005
5,630
25
91
Originally posted by: Niku
The human brain is the best computer ever conceived. Eventually, they will start harvesting human brains for use as CPUs, suspended in a protein slurry. GPUs will be segments of people's lobes and sensory cortices! It will be like The Matrix in reverse. Code will be suspended in a collective human brain cluster/consciousness. Eventually, we will start interfacing with the network like in The Matrix until we just start permanently integrating our beings into the mass. And someday, the world will just be covered in one huge super-brain.

Good Times!

mmmm, brain /zombie
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: Idontcare
The SRAM cache on my Kentsfield is superior to the DRAM chips sitting next to it on my mobo, but I'm not about to buy 8GB of 4GHz SRAM from Intel were they to start selling dedicated SRAM chips to replace RAM.

Luckily, you won't have to buy 8GB of SRAM. In the next few years, it's looking like we'll be replacing SDRAM and hard drives with memristors. Sign me up for 128GB of the RAM replacement, and 10TB of the HD replacement.
 

Hulk

Diamond Member
Oct 9, 1999
5,118
3,661
136
This is an interesting question and very difficult to predict. My guess is that there will be a gradual shift in how CPU power is allocated.

Based on the thermal limitations of silicon process technology, I think clock speeds will remain at the 4GHz level.

I think entry-level chips will contain 16 cores, mid-level chips 32 cores, and the extreme processors 64 or possibly even 128 cores.

I also think these cores will be much more efficient than today's best, and software will be much better optimized to deal with multiple cores.

One serious move ahead, I think, will be the way software and hardware dynamically interact depending on the CPU power the user requires at any given moment.

So when you are just browsing the web or doing word processing, you may only be using 1 or 2 cores, and they may even be throttled back quite a bit clockspeed-wise. Then you decide to do some gaming or video editing, and during a deep preview (lots of FX) or a render, 20 or 30 cores might fire up, with your cooling system spinning up to meet the thermal demand. Unused cores will be basically shut off, drawing no power. Process sizes will be small enough to fit all these cores onto chips the size we are seeing today.
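
The power arithmetic is what makes that throttling worthwhile: to first order, dynamic power scales as capacitance x voltage^2 x frequency, so dropping voltage and clock together saves far more than linearly. A rough sketch (the voltage and clock operating points are just illustrative assumptions):

```python
# First-order dynamic power model: P ~ C * V^2 * f (arbitrary units).
# The operating points below are made-up examples, not real silicon specs.

def dynamic_power(volts, ghz, cap=1.0):
    return cap * volts**2 * ghz

full      = dynamic_power(1.2, 4.0)  # core running flat-out
throttled = dynamic_power(0.9, 1.5)  # same core throttled for web browsing
print(f"throttled core draws {throttled / full:.0%} of full power")  # ~21%
```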

That's my prediction.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
The human brain is the best computer ever conceived. Eventually, they will start harvesting human brains for use as CPUs, suspended in a protein slurry. GPUs will be segments of people's lobes and sensory cortices! It will be like The Matrix in reverse. Code will be suspended in a collective human brain cluster/consciousness. Eventually, we will start interfacing with the network like in The Matrix until we just start permanently integrating our beings into the mass. And someday, the world will just be covered in one huge super-brain. Good Times!

The human brain has one noticeable advantage over a computer. Computers need separate processing elements for permanent storage (HDD/SSD, etc.), temporary storage (RAM/caches), and the actual processing. The human brain has the storage part and the processing part unified.

It's like a CPU sitting on top of HDD-capacity memory that runs at the speed of on-die cache. The idea of stacked on-die DRAM is an evolution toward that.

Based on the thermal limitations of silicon process technology, I think clock speeds will remain at the 4GHz level.

I think entry-level chips will contain 16 cores, mid-level chips 32 cores, and the extreme processors 64 or possibly even 128 cores.

I don't think we'll see differentiation based on core count alone. From my point of view there are three classes of programs (and everything in between):

1. Single-threaded applications (very hard to multi-thread)
2. Multi-threaded applications with moderate (limited) scaling, which lean on single-threaded and multi-threaded performance about equally
3. Heavily and easily multi-threaded applications, like graphics on GPUs

In order to satisfy the demands of all three, there must be a different approach. Big, fast-clocked scalar cores are good for single-threaded code. Many cores with lots of bandwidth and good scaling are good for multi-threaded code.

Answer: heterogeneous cores. CPUs that contain a small number of big, fast cores and a large number of small cores.

And Intel had a glimpse of that.

Spring IDF 2005, Platform 2015: http://www.anandtech.com/cpuch...howdoc.aspx?i=2368&p=2

I predict Haswell (the tock after Sandy Bridge) and its derivative/shrink will be the last multiple-large-core CPU. From then on, we move to the idea of a heterogeneous multi-core, with small cores and large cores combined.

But we gotta think of it in a software programmer's way first. They won't transition from moderately sized 8-10 core chips one day to 1 huge core and 30 tiny cores the next. The transition CPU is in the link above. The CPU that makes the transition will have:

2-3 large cores
10 small cores

Large cores are for software case #1, where the code is largely single-threaded and extremely difficult to parallelize; 2-3 of those large cores handle software case #2, where a good mix of single-threaded and multi-threaded performance is needed. Software case #3 will work well with the 10 small cores, since the code is very parallelizable.

Inherently, there will be apps that need ultimate single-thread performance. With the mix of multi-threaded apps and the ever-insatiable demand for single-thread performance, only heterogeneous cores can offer both. Eventually, for most of software case #2, I believe programmers will find ways to make the code fit multi-threading better, so it can all run super fast on the many tiny cores.
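
To put rough numbers on that intuition, here is a minimal Amdahl's-law-style sketch (the core counts and relative speeds are illustrative assumptions, not any real roadmap): pin the serial part of a task to a big core and spread the parallel part across every core.

```python
# Hypothetical model of a heterogeneous CPU: 2 big cores (relative speed 1.0)
# plus 10 small cores (relative speed 0.4), vs. a homogeneous 4-big-core chip.

def runtime(parallel_fraction, big=2, small=10, big_speed=1.0, small_speed=0.4):
    serial = (1 - parallel_fraction) / big_speed        # serial part on one big core
    throughput = big * big_speed + small * small_speed  # all cores working together
    return serial + parallel_fraction / throughput      # runtime vs. one big core alone

for p, label in [(0.05, "case #1, mostly serial"),
                 (0.60, "case #2, moderate scaling"),
                 (0.99, "case #3, heavily threaded")]:
    print(f"{label}: hetero {runtime(p):.2f} vs homogeneous {runtime(p, 4, 0):.2f}")
```

Under these toy numbers, the heterogeneous mix matches the homogeneous quad on serial-heavy code but pulls well ahead as the parallel fraction grows, which is exactly the case #3 argument above.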

 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: IntelUser2000
But we gotta think of it in a software programmer's way first.

I can't help but hope that the answer will come from the GPU industry (both GPU and game programmers) as they already live in a world where massive threading happened a decade ago.

Intel getting Larrabee out will be a massive shot in the arm for the x86 world of massive multithreading.

Interesting how many of the predictions in this thread are already comprehended by SUN's Niagara 2 processor, with its 8-core, 8-threads-per-core design. IDC link

Not saying I like the Niagara 2 philosophy, just a little concerned that apparently SUN's technical roadmap is so uncomplicated that a thread on the AT forums could have drafted it. ;)

edit: This Sun datasheet is much cooler, mostly because it has pretty graphs.