Intel 4-5 year tick/tock cycles when AMD is done?

I think that a new microarchitecture costs a lot less than the multiple billions you suppose.
I think that someone told AMD that, they tried to meet that target with things like high-level software to design microarchitecture, and the result was Bulldozer.

Then again, I think a partial microarchitecture redesign costs less than a full redesign like Bulldozer. And I don't think Intel has had a successful full redesign since...Pentium Pro? PII and PIII were partial redesigns, P4 was a bust, and PM and Core continued on from PIII. Unless Core 2 was a full redesign? I think the rest have been partial, incremental redesigns from Core 2.
 
Core 2 was pretty much a full redesign.
And Nehalem was a redesign at least on par with the move from Athlon to Athlon 64, and certainly more so than Athlon to Phenom.
 
The thing is, Intel can still keep pushing to 14nm and below, but focus CPU design on tablet/low-power devices, i.e. Atom.

And extend the cycles for desktop. The recently announced Haswell Refresh is definitely what I see becoming the norm.

The majority of desktops have 4 cores sitting idle most of the time.
 
The desktop cycle is stretched out because of Atom. Laptops and servers still get 14nm right away. And when a new uarch comes (Skylake), they all get updated at the same time, since it uses the established node. Before Atom got higher priority, it was Atom that was behind on the process node.

It's simply a capacity issue. If they could, they would do everything on 14nm from the start.
 
So would Intel benefit from MORE CapEx spending? (More fabs?) I thought they cancelled putting equipment into a couple of fabs that were slated for 14nm capacity.

Intel must really be betting that there will be customers for its 14nm Atom chips. Do they have any non-PC-OEM design wins? (I.e., above and beyond Dell/Asus tablets.)
 
It seems to be an issue of how much the toolmakers can supply.
 
With AMD dying.

Eh? AMD has been in the black for the last two quarters. AMD hasn't been trying to compete with Intel on the performance front for a few years now...

"We're at an inflection point," said company spokesman Mike Silverman. "We will all need to let go of the old 'AMD versus Intel' mind-set, because it won't be about that anymore."
Although AMD has been vague about its plans, the company is widely expected to push hard to get its chips into smartphones and tablets. Those markets not only are dominated by other companies, but its gargantuan archrival is trying to elbow its way into them, too -- potentially moving the war with Intel onto a new battleground.
I would rather go by average FPS than just the maximum FPS.
I would rather use the minimum FPS...
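The difference between these metrics is easy to see from per-frame render times. As a minimal Python sketch (the function name and the sample frame times are made up for illustration, not from any benchmark tool discussed here): average FPS is total frames over total time, while minimum FPS is set by the single slowest frame, which is why one stutter barely moves the average but tanks the minimum.

```python
def fps_stats(frame_times_ms):
    """Compute average, minimum, and maximum FPS from per-frame
    render times in milliseconds.

    Average FPS = total frames / total elapsed time.
    Minimum FPS = instantaneous rate of the slowest frame.
    Maximum FPS = instantaneous rate of the fastest frame.
    """
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s
    min_fps = 1000.0 / max(frame_times_ms)   # slowest frame
    max_fps = 1000.0 / min(frame_times_ms)   # fastest frame
    return avg_fps, min_fps, max_fps

# Hypothetical run: fifty-nine 16.7 ms frames (~60 FPS) plus one 50 ms stutter.
times = [16.7] * 59 + [50.0]
avg, lo, hi = fps_stats(times)
# avg stays close to 58 FPS, but the minimum drops to 20 FPS.
```

One bad frame is nearly invisible in the average yet dominates the minimum, which is the point being made above about judging smoothness.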
 