AMD's planning an 'escape from confines of Intel's x86 instruction set'

Diasper

Senior member
Mar 7, 2005
709
0
0
I found a small article about the (possible) future trajectory AMD is planning for its CPU architecture, suggesting it is looking to do a complete re-take on the x86 instruction set. Given the recent developments in non-x86 architectures I might otherwise have dismissed it, but it's got me interested - where could AMD really go to 'escape from the confines of Intel's x86 architecture'?


Article from infoworld.com

One strategic path that will knock you for a loop, and which I'll detail soon, is AMD's coming escape from the confines of Intel's x86 instruction set. To this point, AMD has resisted the temptation to overhaul the x86, even though it sorely needs it. When Fab 36 cranks up, AMD will overcome that fear. AMD64 processors will take on performance, scalability, resource management, and availability-related instruction set extensions that will be proprietary to AMD CPUs. Don't freak out: AMD will keep its contract to be 100 percent compatible with Intel-standard processors. But the idea of seeing "optimized for AMD64" stamped on software boxes delights me. Another journalist at the same event posited that AMD's technological lead over Intel will be short-lived and is calling "game over" once Intel's new Pentium M-derived cores debut across the product line. With due respect to my colleague, AMD will extend its lead, showing Intel's reactive strategy for what it is.
 

kpb

Senior member
Oct 18, 2001
252
0
0
They've already done it. It's called x86-64 or whatever you wanna call it. They will just define another extension to the x86 architecture just like they did in the past. They tried it with 3DNow! and it didn't catch on. They tried it with 64-bit and forced Intel to follow their lead. They'll do it again and who knows what will happen. This may be talking about their virtualization stuff, I forget what they call it now, or some other tech they're still working on.
 

MrDudeMan

Lifer
Jan 15, 2001
15,069
94
91
i am interested to see what pm, ctho, tuxdave, etc. have to say about this. they are the only ones here that really have the right to voice an opinion since no one else knows squat about the innards of a processor.
 

pinion9

Banned
May 5, 2005
1,201
0
0
Originally posted by: Bigsm00th
i am interested to see what pm, ctho, tuxdave, etc. have to say about this. they are the only ones here that really have the right to voice an opinion since no one else knows squat about the innards of a processor.

Just because you don't understand how processors work doesn't mean no one else does. Did you have a specific question about how they work?

I agree with kpb, though. If they stay 100% Intel compatible, they will either add instruction set extensions or emulate x86 (like the Crusoe did). I doubt they will invent a new architecture or move to EPIC or anything of that sort.

 

MrDudeMan

Lifer
Jan 15, 2001
15,069
94
91
Originally posted by: pinion9
Originally posted by: Bigsm00th
i am interested to see what pm, ctho, tuxdave, etc. have to say about this. they are the only ones here that really have the right to voice an opinion since no one else knows squat about the innards of a processor.

Just because you don't understand how processors work doesn't mean no one else does. Did you have a specific question about how they work?

I agree with kpb, though. If they stay 100% Intel compatible, they will either add instruction set extensions or emulate x86 (like the Crusoe did). I doubt they will invent a new architecture or move to EPIC or anything of that sort.


i understand more than most, but not nearly as much as those guys. chill out. geez.
 

Atheus

Diamond Member
Jun 7, 2005
7,313
2
0
Perhaps it won't just be another extension... maybe they will make their own instruction set and include the ability to translate x86 into proprietary opcodes. I understand this already happens to some degree though, I think most modern processors are basically RISC machines at heart.
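A toy sketch of what that translation looks like, in Python. The x86 mnemonics are real, but the micro-op names and the decode table are invented for illustration; real decoders work on binary encodings, not strings.

```python
# Hypothetical decode table: one x86 instruction -> a list of internal
# RISC-like micro-ops. Complex instructions that touch memory get
# cracked into separate load/store and ALU operations.
DECODE_TABLE = {
    "add eax, [mem]": ["load tmp0, [mem]",     # memory read split out
                       "add eax, tmp0"],       # pure register ALU op
    "push eax":       ["store [esp-4], eax",   # memory write
                       "sub esp, 4"],          # stack pointer update
    "inc ecx":        ["add ecx, 1"],          # simple ops map 1:1
}

def decode(x86_instruction):
    """Return the internal micro-op sequence for an x86 instruction."""
    return DECODE_TABLE[x86_instruction]

for insn, uops in DECODE_TABLE.items():
    print(f"{insn:16} -> {uops}")
```

The point is that the visible instruction set and the internal execution format are decoupled: software only ever sees the left-hand column.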
 

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81
i am interested to see what pm, ctho, tuxdave, etc. have to say about this. they are the only ones here that really have the right to voice an opinion since no one else knows squat about the innards of a processor.

My guess is they're talking about Pacifica. Intel has something similar called Vanderpool.

The switch to the 65nm process is one AMD will take when the market will benefit from it.
That quote amused me :)
 

kpb

Senior member
Oct 18, 2001
252
0
0
Originally posted by: CTho9305
My guess is they're talking about Pacifica. Intel has something similar called Vanderpool.

Yeah, that's the virtualization stuff I was thinking of. That would be my guess too, but it's possible they are doing something else like SSE4. Might be interesting to do some "x64+" and add more architectural registers and a normal register-based FPU instead of the standard stack-based x87 or the SSE hack-around.
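A rough sketch of the two FP models being contrasted here, simulated in Python: the x87's stack machine versus a flat register file (as in SSE or a hypothetical "x64+"). Both evaluate (a*b) + (c*d); the mnemonics in the comments are simplified.

```python
def stack_fp(a, b, c, d):
    """x87-style: operands pushed onto a stack, ops consume the top."""
    st = []
    st.append(a)                     # fld a
    st.append(b)                     # fld b
    st.append(st.pop() * st.pop())   # fmulp -> a*b now on top
    st.append(c)                     # fld c
    st.append(d)                     # fld d
    st.append(st.pop() * st.pop())   # fmulp -> c*d now on top
    st.append(st.pop() + st.pop())   # faddp
    return st.pop()

def register_fp(a, b, c, d):
    """Register-style: named registers, no stack shuffling required."""
    xmm0 = a * b                     # mulsd xmm0, xmm1
    xmm2 = c * d                     # mulsd xmm2, xmm3
    return xmm0 + xmm2               # addsd xmm0, xmm2

print(stack_fp(1.0, 2.0, 3.0, 4.0), register_fp(1.0, 2.0, 3.0, 4.0))
```

Same result either way, but the stack version forces an ordering on intermediate values, which is part of why compilers found x87 code awkward to schedule.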
 

dmens

Platinum Member
Mar 18, 2005
2,275
965
136
Beats the hell out of me. Vanderpool/Pacifica is just another privilege level, has nothing to do with revamping x86. Intel has been trying to get IA-64 into the mainstream with no success for years now. Perhaps AMD is talking about platform features. Who's reactionary now, LOL!

Anyways, Intel has been working on *T for quite some time now; they are technologies which extend the capability of the processor and allow closer workings with software. It isn't breaking away from x86, but extending traditional processor capability. I assume AMD is working on similar features.

Back to x86 issue, it is here to stay, and software trends still demand robust single-thread performance. That means x86 circuitry will remain at the forefront of any core that wants to stay in the mainstream market. For the time being, the only thing I can see is baby steps to extend x86. When designers figure out a way to make emulated x86 go fast or when single thread perf demands taper off, then maybe there will be talk of a new uarch to replace x86, provided there is enough real estate to work with, and other concerns such as power have been addressed.

Of course that brings about an even more important question... if x86 can satisfy single-thread performance demands, why bother replacing it?
 

stardrek

Senior member
Jan 25, 2006
264
0
0
I hope AMD creates a better floating point instruction set. Their old one has been kinda weak and a new means of doing this could really help them out in the 'big' server market.
 

Titan

Golden Member
Oct 15, 1999
1,819
0
0
Well, we already have an on-die memory controller, so why not more integration? PCI express connector, 10 gigabit ethernet controller, RNG, crypto unit, and GPU?

What process do standard chipsets use? Could we shrink a southbridge and northbridge on die with not too much 65nm real estate being used?

I'm curious to see how that tech that reduces the cache area by a factor of 5 plays into things.
 

dmens

Platinum Member
Mar 18, 2005
2,275
965
136
Incorporating those things into the CPU is a terrible idea: it will take up expensive real estate, with all the bad things that go with it (cost, yield, effort, external interface); the ROI for all that effort is questionable; and an on-die implementation is inflexible, especially for tech that could become outdated in a couple of years.
 

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81
I hope AMD creates a better floating point instruction set. Their old one has been kinda weak and a new means of doing this could really help them out in the 'big' server market.
Opterons kick butt at floating point (note, those numbers are old - you can get much faster Opterons now than they had then). What are you basing your statement on?
 

borealiss

Senior member
Jun 23, 2000
913
0
0
Originally posted by: Titan
Well, we already have an on-die memory controller, so why not more integration? PCI express connector, 10 gigabit ethernet controller, RNG, crypto unit, and GPU?

What process do standard chipsets use? Could we shrink a southbridge and northbridge on die with not too much 65nm real estate being used?

I'm curious to see how that tech that reduces the cache area by a factor of 5 plays into things.


integration has its costs. amd would also be alienating much of the chipset partnerships that help its business model. this is why amd is not producing any more chipsets. after lokar/thor they are done.

the ondie memory controller is great for performance, but if you can get close to this type of performance without an ondie mem controller, it is better. there are many headaches associated with having a fixed piece of silicon tied to something as important as the type of memory in your computer. memory markets change. look at how cautious they are about committing to ddr2. it would require an entirely new mask design just to remain competitive, and performance benefits are questionable. the athlon64 w/ 2 memory channels performs, on average, 10-15% higher than a single channel variant. doubling the bandwidth yielded marginal performance increases. ddr2 promises even more, but can the architecture take advantage of it? with an offdie memory controller, you can still pump out your core cpu and change the platform slightly. modularity is better for manufacturing, time-to-market, and has less engineering overhead than full-on integration.

The switch to the 65nm process is one AMD will take when the market will benefit from it.

Yonah...
 

irwincur

Golden Member
Jul 8, 2002
1,899
0
0
Perhaps this is in reference to the new co-processor helper chips that so many think AMD is going to create.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: kpb
They've already done it. It's called x86-64 or whatever you wanna call it. They will just define another extension to the x86 architecture just like they did in the past. They tried it with 3DNow! and it didn't catch on. They tried it with 64-bit and forced Intel to follow their lead. They'll do it again and who knows what will happen. This may be talking about their virtualization stuff, I forget what they call it now, or some other tech they're still working on.

I don't know if I'd put 3DNow! in quite the same category. x86-64 is a superset of x86, but 3DNow! is more like a superset of just MMX. 3DNow! also wasn't really needed once video cards came around, and SSE was better anyway.

BTW, with x86-64 it was rumored that AMD would completely overhaul their SIMD instruction sets, but instead chose to adopt SSE.

if x86 can satisfy single-thread performance demands, why bother replacing it?

If things like Cell show that they can perform much better.
I think replacing x86 would mostly be to simplify things, it would make extensions to the cpus simpler and ease assembly coding.
 

dmens

Platinum Member
Mar 18, 2005
2,275
965
136
Absolutely, but because of the issues I listed above, it will be quite difficult to find "something better". Cell has proven itself to be an economically infeasible scientific experiment, kinda like HP's approach to Itanium.

I think x86 actually makes adding extensions easier, since ucode allows arbitrary backend implementations. As for assembly coding, yes it will be much easier to do hand massaged performance code, but that seems like a niche market.
 

BrownTown

Diamond Member
Dec 1, 2005
5,314
1
0
Everyone knows that the x86 instruction set is terribly outdated, and held together by a bunch of hacks and additions to try to make it productive in the modern era of computing. The fact of the matter is that it isn't really all that hard to create an instruction set that is better suited for today's computers and software; the problem is that every computer nowadays, and every program, is written in x86 code, so if you switch ISAs every piece of software will be useless until companies are able to recompile and optimize the source code for the new ISA. Also, old computers will be unable to run new software.
 

dmens

Platinum Member
Mar 18, 2005
2,275
965
136
Tell me how x86 is not sufficient for today's consumer (high end to low end) workload, and why it is a bunch of hacks, since it is just a definition.

You'd be surprised how much can be done with something that "outdated".
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: dmens
Absolutely, but because of the issues I listed above, it will be quite difficult to find "something better". Cell has proven itself to be an economically infeasible scientific experiment, kinda like HP's approach to Itanium.

I think x86 actually makes adding extensions easier, since ucode allows arbitrary backend implementations. As for assembly coding, yes it will be much easier to do hand massaged performance code, but that seems like a niche market.

Cell hasn't even hit the market yet, don't be so harsh to it! Especially when future cores are either going to go the way of Cell, or the way of old x86 cpus....as in using coprocessors.

Anyhow, I think all modern processors use ucode; that still doesn't mean the instruction set can't be better (though for performance, I'd say x86 is arguably better than alternatives like PowerPC, but falls short in many other ways).

Strange that you say Cell is dead, yet say assembly coding is a niche market. Someone has to make the compilers, and no matter how skilled they are, a poor instruction set can make certain things unfeasible.
 

cheesehead

Lifer
Aug 11, 2000
10,079
0
0
I'm no processor designer, but I think I've noticed something that might be a clue: a large percentage of their processors are actually made by Big Blue, and in turn, Big Blue is using them in quite a few servers. IBM, in turn, has rights to the Cell processor and the technology that makes it work. The Cell, which will be in production in the very near future, should be $50 or so in a configuration similar to the PS3's, and so long as AMD and IBM work together, they could give a direct link to the processor through, say, a secondary socket or slot. (And the chipset makers would love this, as they could still use their old chipsets.)
Also, AMD might license a physics instruction set, allowing them to make "gaming" processors that can actually out-perform an Opteron in games. Or do a combination of the above.
 

dmens

Platinum Member
Mar 18, 2005
2,275
965
136
Cell isn't dead, I just have a lot of doubt about its commercial viability. It is very difficult to program, and even with IBM's immense compiler/dev support, not many people are willing to work with it. Some PlayStation developers declined to do work on the PS3 simply because of Cell. It is a float monster, but that's basically all it is good for, hence it is not suitable for general computing.

When a substantial demand exists for extreme float loads, then cell will thrive.