Intel CPUs in 7-10 years


NTMBK

Lifer
Nov 14, 2011
10,459
5,845
136
I.e. Intel will follow AMD's strategy, and focus on "compute cores", whether CPU or GPU cores.

Similar, but with all cores running the same instruction set. Making CPU cores that can handle graphics efficiently and doing away with GPU cores.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
My theory: Larrabee finally comes to fruition, and Intel replaces integrated graphics with x86 cores plus texture units. A consumer SoC has a handful (about 2) of high-frequency, out-of-order, very wide CPU cores for single-threaded performance, and dozens of simple in-order cores handling multithreaded workloads like graphics. Of course, this will need a fairly major rethink of OS scheduling techniques.

Plausible, very plausible.
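
On the scheduling point in that theory: here is a minimal sketch of what steering work between core types could look like with today's APIs (Linux-only; the core IDs and the big/small split below are hypothetical, invented purely for illustration):

```python
import os

# Hypothetical core layout for the SoC described above: two big out-of-order
# cores for single-threaded speed, many small in-order cores for throughput.
BIG_CORES = {0, 1}
SMALL_CORES = set(range(2, 34))

def pin_current_process(cores):
    """Restrict the calling process to the given set of CPUs (Linux-only)."""
    os.sched_setaffinity(0, cores)

# E.g. keep a latency-sensitive app on the big cores:
pin_current_process(BIG_CORES)
```

The hard part the quote alludes to is the OS doing this automatically, per thread and per program phase, instead of developers pinning things by hand.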

My theory is that the entire computing industry will become as blasé as the scientific calculator is today.

Yeah they exist, still, and yeah every year Casio and Texas Instruments roll out yet-another-model with indistinguishable improvements over last year's model, but that is what it has come down to for them.

Digital cameras reached that point a couple of years ago as well; how many megapixels do you need (or want) in your Canon these days?

Same with printers, remember the mad rush to have better and better inkjet resolution... and then all of a sudden it hit some number (was it 1200dpi?) and it was like the market fell off a cliff and no one cared anymore?

Same with computers: GHz and GBs are pretty much saturated selling points by now.

I'm not worried about Intel's CPUs in 7-10 yrs, but I am sure Intel is.
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
Except that we will always need (want) more computing power. At least some people. Computers also go into new form factors when they become powerful enough, while cameras, for example, are limited by other things, like the lens.
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
From your link:

"Realistically speaking, [...]"

:hmm:

10 years later still not even close to 10GHz.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
If Intel doesn't start delivering a lot more performance gains in that period then I think it's fair to say computers, and the changes we just went through, do come to a grinding halt. We will rapidly hit the end of improvements in a lot of areas and it will probably lead to Intel going bust. They don't have a choice: they either deliver more performance or they will disappear.
 

NTMBK

Lifer
Nov 14, 2011
10,459
5,845
136
If Intel doesn't start delivering a lot more performance gains in that period then I think it's fair to say computers, and the changes we just went through, do come to a grinding halt. We will rapidly hit the end of improvements in a lot of areas and it will probably lead to Intel going bust. They don't have a choice: they either deliver more performance or they will disappear.

Nah, x86 is too valuable for legacy support. Worst case, Intel's chip design division becomes a VIA-style fabless designer for legacy business apps. But they're a long way from that, and their fab tech is peerless and will last a long time.
 
Aug 11, 2008
10,451
642
126
I think Intel's future hinges on how much ARM impinges on the server and enterprise market. I see Intel remaining viable, but I am not sure they can maintain their high-margin business model. It also depends on the overall health of the world economy: if there were another major shock like 2008, that could spell big trouble for Intel.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
If Intel doesn't start delivering a lot more performance gains in that period then I think it's fair to say computers, and the changes we just went through, do come to a grinding halt. We will rapidly hit the end of improvements in a lot of areas and it will probably lead to Intel going bust. They don't have a choice: they either deliver more performance or they will disappear.

Intel's current consumer predicament is caused by their ONLY core competency, making traditional x86 chips, having greatly overshot the needs of the vast majority of consumers... And you think they will make chips much faster, so as to make upgrade cycles even more unbearable for them?

*laughter*
 

Sattern

Senior member
Jul 20, 2014
330
1
81
Skylercompany.com
I think Intel will start working on other things besides computers, laptops, and other common items that use processors.

You should see a technological revolution within 15-20 years where factory robots become far more mainstream than they are today.

Don't expect iRobot-style technology; things like manufacturing and industry will be the first to advance.

Then, once Google finishes its automated cars, there should be consumer product lines where robots could be useful, especially for older people.
 

cytg111

Lifer
Mar 17, 2008
26,246
15,660
136
Remember that the circle is divine; everything comes in circles, overdamped or underdamped. If we have to put Moore's law on pause for a second, it won't be all that bad: there is plenty of performance to be extracted from the hardware we already have. Way too many high-level, interpreted languages eat CPU and memory like a *censored*. Maybe we will use the time to start optimizing our code instead of just throwing more hardware at it.
Relevant XKCD: http://xkcd.com/676/
How many cycles/ticks do you *really* need to render that cat?
 

Mark R

Diamond Member
Oct 9, 1999
8,513
16
81
If Intel doesn't start delivering a lot more performance gains in that period then I think it's fair to say computers, and the changes we just went through, do come to a grinding halt. We will rapidly hit the end of improvements in a lot of areas and it will probably lead to Intel going bust. They don't have a choice: they either deliver more performance or they will disappear.

I think we'll start seeing hybrid compute chips, perhaps with application-specific or programmable logic.

Intel is already selling CPUs with custom application-specific add-ons to big enough, and rich enough, customers. E.g. Oracle has announced database servers based on custom Xeon CPUs with database-specific enhancements.

It isn't too much of a leap to conceive of programmable logic cells/FPGA cores on the CPU. Intel already provides foundry services for 3rd-party FPGA vendors, so the manufacturing technology isn't a problem; I could imagine some sort of technology licensing happening in the next few years.
 

Essence_of_War

Platinum Member
Feb 21, 2013
2,650
4
81
We will have approximately 1.08^7 = 1.71 => 71% faster Intel CPUs in the early 2020s. Power consumption on the low-performance SKUs will be a lot lower, though. That is, if going by the track record of the last few years.

Heh heh, I was smarmily going to say something like this too :). In the next 7-10 years we'll see something between 1.05^7 => ~+41% and 1.10^10 => ~+160%.
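
A quick sanity check of those compound-growth figures (a minimal sketch; the 5%, 8%, and 10% annual gains are the posters' assumptions, not measured data):

```python
# Compound yearly CPU performance gains, per the figures quoted above.
def compound_gain(annual_rate: float, years: int) -> float:
    """Total speedup factor after `years` of steady `annual_rate` gains."""
    return (1 + annual_rate) ** years

for rate, years in [(0.08, 7), (0.05, 7), (0.10, 10)]:
    total = compound_gain(rate, years)
    print(f"{rate:.0%}/yr over {years} yrs -> {total:.2f}x (+{total - 1:.0%})")
# 8%/yr over 7 yrs -> 1.71x (+71%)
# 5%/yr over 7 yrs -> 1.41x (+41%)
# 10%/yr over 10 yrs -> 2.59x (+159%)
```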
 

Hulk

Diamond Member
Oct 9, 1999
5,163
3,781
136
Funny thing is, Prescott at 3.8GHz was introduced 10 years ago. 10 years ago. Think about that. And for all intents and purposes we are still at that clock speed. But Intel has found a way, through increased IPC and more cores, to keep CPU upgrades relevant for most people.

They want to make money so they'll find a way to make something we just must have.

I think we'll see more cores and better software/hardware optimization to get increased performance. And of course the big advertising push will be that we can have our current desktop computing power in small devices with long battery life, which is where, for better or worse, the market is headed.

Ain't it lucky for Intel that the consumer push for smaller more efficient computing devices came at a time when clockspeed and IPC were pretty much topping out and node shrinks are really only providing lower TDP anyway? Funny how things work out.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Except that we will always need (want) more computing power. At least some people. Computers also go into new form factors when they become powerful enough, while cameras, for example, are limited by other things, like the lens.

All things "max out" and plateau once the value they add to the consumer's experience with the device of interest begins to max out and plateau.

Digital camera resolution maxed out and plateaued because very few people care to print their pics larger than 5x7, or, worse, they only care to see them digitally on a 96-100dpi screen anyway.

Same with computing. The target audience for consumer PCs, the meaty middle 80%, is already saturated with today's compute capability. There is no killer app for the masses to drive crazy-high replacement rates like the internet did (the last, and perhaps only, true killer app for the masses).

Ain't it lucky for Intel that the consumer push for smaller more efficient computing devices came at a time when clockspeed and IPC were pretty much topping out and node shrinks are really only providing lower TDP anyway? Funny how things work out.

I can see how it might seem like one big coincidence from certain perspectives, but I think the reality is that the market for low-power, mobile devices always existed; the performance trade-off of going lower-power and mobile was simply a deal-killer in prior decades.

So it may just be that while the internet was the killer app for the masses say from 1993 to 2010, the killer app for the masses for 2010 to 2025 will be mobility.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
Ain't it lucky for Intel that the consumer push for smaller more efficient computing devices came at a time when clockspeed and IPC were pretty much topping out and node shrinks are really only providing lower TDP anyway? Funny how things work out.

Too bad for Intel this shift occurred when ARM-based companies can produce chips that are "good enough", with cost advantages, better end-product vertical integration, more flexible business models, and far more critical IP, such as baseband modems, to succeed in the consumer space. Heck, Qualcomm could pretty much stop producing SoCs altogether and still retain 2/3 of their total revenue through IP licensing alone.
 

ninaholic37

Golden Member
Apr 13, 2012
1,883
31
91
Ain't it lucky for Intel that the consumer push for smaller more efficient computing devices came at a time when clockspeed and IPC were pretty much topping out and node shrinks are really only providing lower TDP anyway? Funny how things work out.
Heh. That is interesting.

Too bad for Intel this shift occurred when ARM-based companies can produce chips that are "good enough", with cost advantages, better end-product vertical integration, more flexible business models, and far more critical IP, such as baseband modems, to succeed in the consumer space. Heck, Qualcomm could pretty much stop producing SoCs altogether and still retain 2/3 of their total revenue through IP licensing alone.
Another interesting point.

So it may just be that while the internet was the killer app for the masses say from 1993 to 2010, the killer app for the masses for 2010 to 2025 will be mobility.
Somebody wake me up in 2026.
:sleeping:
 
Mar 10, 2006
11,715
2,012
126
Too bad for Intel this shift occurred when ARM-based companies can produce chips that are "good enough", with cost advantages, better end-product vertical integration, more flexible business models, and far more critical IP, such as baseband modems, to succeed in the consumer space. Heck, Qualcomm could pretty much stop producing SoCs altogether and still retain 2/3 of their total revenue through IP licensing alone.

It really pays to get your facts straight:

[image: chart of Qualcomm's revenue breakdown by segment]


Much more than 1/3 of Qualcomm's revenue would go away if it stopped selling chips.
 

CrazyElf

Member
May 28, 2013
88
21
81
We'll be at 10GHz @ < 1V by 2005.

http://www.anandtech.com/show/680/6

Dennard scaling died (a rough sketch of what that meant follows the list below). The issue, I think, is that by the Intel P4/AMD K8 era, PCs had already hit that "good enough" threshold. For the average person, they need:
- Something that can browse the web
- Something that can type up a Word document, Excel sheet, or PowerPoint
- Maybe a few other apps, like video playback, nothing too CPU-intensive
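
For context on why the 10GHz-by-2005 prediction collapsed, here is a minimal sketch of the classic dynamic-power relation, P ≈ C·V²·f; every number below is an illustrative assumption, not a measurement:

```python
# Dynamic CPU power: P ~ C * V^2 * f (switched capacitance, voltage, frequency).
# Under ideal Dennard scaling, V and C shrink with the transistor, so frequency
# can rise while power density stays flat. Leakage ended the voltage scaling.
def dynamic_power(c, v, f):
    return c * v**2 * f

p_prescott = dynamic_power(c=1.0, v=1.2, f=3.8e9)   # Prescott-era ballpark
p_dream    = dynamic_power(c=1.0, v=0.7, f=10e9)    # the "10GHz @ <1V" dream
p_reality  = dynamic_power(c=1.0, v=1.0, f=10e9)    # voltage stuck near ~1V

print(f"10GHz @ 0.7V vs 3.8GHz @ 1.2V: {p_dream / p_prescott:.2f}x power")
print(f"10GHz @ 1.0V vs 3.8GHz @ 1.2V: {p_reality / p_prescott:.2f}x power")
```

Once voltage stopped scaling, every extra GHz meant roughly linear power growth, which is a big part of why the industry pivoted to more cores instead.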

I mean, even I as an enthusiast seem to be reaching a good-enough point too.

- Games (GPU more so than CPU is the limit there)
- Need something for video encoding (main reason why I am looking to get an X99 system some time next year)
- Bigger hard drives (video files take a lot of space)

I guess that in turn drives what I am excited about:
- SSDs, but not as much as before
- New hard drive technology
- GPU technology

It seems to me that single-threaded performance has topped out. Say you had a 5.0 GHz i7 2600K @ 1.45V, so pretty decent luck in the silicon lottery. That 3.5-year-old processor is still pretty fast; a 4.7-4.8 GHz Haswell chip today is not that much faster. It's only workloads that can use the new instruction sets that seem to show decent per-clock gains. Perhaps you can only optimize things like branch predictors to a certain level?

Silicon, too, seems to be hitting its limits. If you think about it, the battle was always against heat and leakage; that, and economics (fabs get exponentially more expensive to build as the process shrinks). But as you shrink, the benefit per shrink goes down as you approach the atomic scale. It was inevitable that heat, leakage, and economics would win the day.
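
To put rough numbers on that single-threaded comparison: effective performance is approximately clock × IPC, and the ~12% Sandy Bridge-to-Haswell IPC uplift below is an illustrative assumption, not a benchmark result:

```python
# Back-of-the-envelope single-threaded estimate: perf ~ clock * relative IPC.
sandy_bridge = {"clock_ghz": 5.0, "ipc": 1.00}  # well-binned i7-2600K overclock
haswell      = {"clock_ghz": 4.8, "ipc": 1.12}  # assumed ~12% IPC gain over SNB

perf_snb = sandy_bridge["clock_ghz"] * sandy_bridge["ipc"]
perf_hsw = haswell["clock_ghz"] * haswell["ipc"]
print(f"Haswell vs. overclocked Sandy Bridge: {perf_hsw / perf_snb - 1:+.1%}")
# -> about +7.5% after 3.5 years: the "not that much faster" described above
```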

I personally am skeptical that new technologies will bring back, say, Dennard scaling. Lately it seems a lot of promising stuff has gone by the wayside: EUV, 450mm wafers, and the latest processes all seem to have issues. TSMC's 20nm and Intel's 14nm both appear delayed and, likely, problem-plagued.

I think there's still some more performance to be had from GPUs, owing to their parallel nature, but my guess is that even that will top out.

After that, well, computers will be about as exciting as cars: a mature technology, with only incremental improvements per generation. Maybe less than that, even?
 

cytg111

Lifer
Mar 17, 2008
26,246
15,660
136
Yea? This is how it will go down. VR is about to get its second coming. Then porn. OMG, VR porn is so real, I need more real!.. More real means higher res, better AI, better everything. As always, porn has been driving the internet since forever.
Augmented reality is creeping up on you folks, and it needs compute power.
 

Kippa

Senior member
Dec 12, 2011
392
1
81
Just because the chips can't get any smaller doesn't mean the tech will end there. If I were a betting person I would bet on 3D stacking.
 

EightySix Four

Diamond Member
Jul 17, 2004
5,122
52
91
Out of curiosity, in a "system bench" type benchmark with a varied workload, at what frequency would a P4 need to operate to keep pace with one core of a non-overclocked i7-4790K?