VMware CEO: Intel's x86 filled with junk silicon

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,897
3,249
126
all that "junk" rapes its competition in everything you throw at it.

So how does one call it junk when it seems to work perfectly well?
 

masteryoda34

Golden Member
Dec 17, 2007
1,399
3
81
I interpreted it as a criticism of x86 in general. He just used Intel's name because they created x86 and are the dominant manufacturer. At least that was my interpretation.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,897
3,249
126
so that would also imply AMD's CPUs are junk because they use x86, and the Atom is junk, and so is almost every processor out there. :p

My machine can play Crysis, so no, I don't think it's junk. :X
 

HOOfan 1

Platinum Member
Sep 2, 2007
2,337
15
81
Originally posted by: masteryoda34
I interpreted it as a criticism of x86 in general. He just used Intel's name because they created x86 and are the dominant manufacturer. At least that was my interpretation.

who knows...the article was written by Theo Valich. I don't know who is worse, him or Charlie Demerjian
 

masteryoda34

Golden Member
Dec 17, 2007
1,399
3
81
Originally posted by: HOOfan 1
Originally posted by: masteryoda34
I interpreted it as a criticism of x86 in general. He just used Intel's name because they created x86 and are the dominant manufacturer. At least that was my interpretation.

who knows...the article was written by Theo Valich. I don't know who is worse, him or Charlie Demerjian

In this case, Charlie is just paraphrasing. His source is here, complete with video of Paul Maritz himself.
http://techpulse360.com/2009/0...ed-for-mobile-devices/
 

drizek

Golden Member
Jul 7, 2005
1,410
0
71
Originally posted by: aigomorla
all that "junk" rapes its competition in everything you throw at it.

So how does one call it junk when it seems to work perfectly well?

Because it wastes a ton of energy when it doesn't need to. If your processor is just sitting there, why is it still using electricity?
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Originally posted by: drizek
Originally posted by: aigomorla
all that "junk" rapes its competition in everything you throw at it.

So how does one call it junk when it seems to work perfectly well?

Because it wastes a ton of energy when it doesn't need to. If your processor is just sitting there, why is it still using electricity?

It's using a lot less when it's sitting there. Just like almost all tech.
 

HOOfan 1

Platinum Member
Sep 2, 2007
2,337
15
81
Originally posted by: masteryoda34
Originally posted by: HOOfan 1
Originally posted by: masteryoda34
I interpreted it as a criticism of x86 in general. He just used Intel's name because they created x86 and are the dominant manufacturer. At least that was my interpretation.

who knows...the article was written by Theo Valich. I don't know who is worse, him or Charlie Demerjian

In this case, Charlie is just paraphrasing. His source is here, complete with video of Paul Maritz himself.
http://techpulse360.com/2009/0...ed-for-mobile-devices/

he quite obviously was not saying that Intel processors are junk...

Modern day Yellow Journalism...
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,897
3,249
126
Originally posted by: HOOfan 1
he quite obviously was not saying that Intel processors are junk...

Modern day Yellow Journalism...

I'm sad he called all my machines junk... :(
 

thilanliyan

Lifer
Jun 21, 2005
11,958
2,184
126
Originally posted by: aigomorla
all that "junk" rapes its competition in everything you throw at it.

So how does one call it junk when it seems to work perfectly well?

I also don't think he was comparing it to AMD, etc. It was just about why x86 isn't that great.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: drizek
Originally posted by: aigomorla
all that "junk" rapes its competition in everything you throw at it.

So how does one call it junk when it seems to work perfectly well?

Because it wastes a ton of energy when it doesn't need to. If your processor is just sitting there, why is it still using electricity?
You mean when the computer is in a standby state similar to a cell phone or DVD player?

http://tvtool.info/FlashHelp/S....htm#S3_Power_mode.htm
Our measurements have shown that the power consumption in S3 ("sleep") mode is almost exactly the same as if the PC is shut down normally. In worst case the consumption was at 1.8 Watts compared to 1.1 Watts when the PC was shut down.

Different architectures aren't any better. The IBM G5 used in Apple's Power Mac used about twice as much power as a comparable Intel Pentium 4 at the time. The Xbox 360 uses a PowerPC processor and it gets hot like you wouldn't believe. Video cards use a completely different architecture, and they're arguably the most power-consuming component in many of our computers.
 

drizek

Golden Member
Jul 7, 2005
1,410
0
71
No, I mean like when the computer is sitting around, doing mostly nothing. As I write this post for example. Ya, it clocks down, undervolts, and so on, but it is still wasting a ton of power. All it is doing is just putting text into a box. It isn't all a result of architecture, I think it is manufacturing methods as well, but I think a good analogy would be to compare a regular LCD with an e-ink display. An e-book reader uses virtually no power when it is just displaying a page, it only uses electricity when it is changing the text.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
It's not the manufacturing process or the architecture that causes it to use so much power. It's simple physics. An Intel Q9550 has over 800 million transistors and they all leak current. You can lower the frequency and you can lower the voltage, but you can't get away from 800 million transistors. An Intel Atom only uses about 5W of power, but it only has about 50 million transistors, 1/16 as many. If scaling were perfectly linear, you would expect a quad-core Intel to use 16 x 5 = 80W at ~1.6GHz (the frequency of an Atom). With a bit of underclocking and undervolting, you should be happy if that number gets below 50W, even when doing absolutely nothing.
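As a rough back-of-the-envelope check of that linear-scaling argument, here is a minimal Python sketch using only the figures quoted above (~5W and ~50 million transistors for the Atom, ~800 million for the Q9550). The scaling rule itself is the post's naive assumption, not a measurement:

```python
# Back-of-the-envelope check of the linear-scaling argument above.
# Numbers are the ones quoted in the post, not measurements.

ATOM_POWER_W = 5.0          # rough Atom power at ~1.6 GHz
ATOM_TRANSISTORS = 50e6     # ~50 million transistors
Q9550_TRANSISTORS = 800e6   # ~800 million transistors

def scaled_power(transistors, ref_power=ATOM_POWER_W, ref_transistors=ATOM_TRANSISTORS):
    """Naive estimate: power grows linearly with transistor count."""
    return ref_power * (transistors / ref_transistors)

estimate = scaled_power(Q9550_TRANSISTORS)
print(f"Naive linear estimate for a Q9550-sized chip: {estimate:.0f} W")  # -> 80 W
```

Clock and power gating let real chips come in well under this naive ceiling, which is the point of the "should be happy if it gets below 50W" remark.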

If you want your computer to use less power, don't buy so many transistors. Complaining about the idle power consumption of 800 million transistors is as silly as those people who wonder why the 4-cylinder and 6-cylinder models of the exact same car use different amounts of gasoline, or people who buy a 1000W PSU and wonder why it has horrendously bad power efficiency when their computer only uses 200W. Golden rule of life: things have higher efficiency when they are used close to their rated value. Running something at 10% of its maximum output, regardless of what it is, will never be efficient.

If you want to know why some picture viewer or my Texas Instruments calculator can last a year on 4 AA batteries, it's because a Zilog Z80 processor only has 8500 transistors.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: masteryoda34
Originally posted by: HOOfan 1
Originally posted by: masteryoda34
I interpreted it as a criticism of x86 in general. He just used Intel's name because they created x86 and are the dominant manufacturer. At least that was my interpretation.

who knows...the article was written by Theo Valich. I don't know who is worse, him or Charlie Demerjian

In this case, Charline is just paraphrasing. His source is here, complete with video of Paul Maritz himself.
http://techpulse360.com/2009/0...ed-for-mobile-devices/

Definitely.

For starters, let's not be gullible here, people. Ask yourselves why VMware is spending money (salary/travel/expenses) so some guy from VMware can trash and bash an ISA (he talks ISA - x86 - but bases his statements on an unspecified architecture - "all those gates" - so be wary right there) that has little to do with the specifics of their own revenue and business model...

There is a reason you are being given this "free" advice regarding VMware's opinion on the validity of Intel's "threat" to ARM's marketspace.

My advice: stop watching the commercials (and that is all this is, an advertising campaign), step above it, and pay attention to the money trail, because that is what speaks to the underlying motivation.
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Originally posted by: drizek
No, I mean like when the computer is sitting around, doing mostly nothing. As I write this post for example. Ya, it clocks down, undervolts, and so on, but it is still wasting a ton of power. All it is doing is just putting text into a box. It isn't all a result of architecture, I think it is manufacturing methods as well, but I think a good analogy would be to compare a regular LCD with an e-ink display. An e-book reader uses virtually no power when it is just displaying a page, it only uses electricity when it is changing the text.

Reminds me of some of the modern V8s that shut down cylinders under normal conditions. You could have quads shutting down 3 cores under light loads.
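For a rough idea of what the core-parking half of that analogy could look like in software, here is a minimal sketch against Linux's CPU hotplug interface in sysfs. It assumes a Linux box with hotpluggable cores and root access; in practice the heavy lifting is done in hardware via sleep states (C-states) rather than by the OS flipping files:

```python
# Rough sketch of the "shut down cores under light load" idea on Linux,
# using the CPU hotplug interface in sysfs (requires root; cpu0 usually
# cannot be taken offline). Illustrative only, not a power-management daemon.
from pathlib import Path

def set_core_online(cpu: int, online: bool) -> None:
    """Bring a core online (True) or take it offline (False)."""
    Path(f"/sys/devices/system/cpu/cpu{cpu}/online").write_text("1" if online else "0")

def park_cores(cores=(1, 2, 3)) -> None:
    """Park the given cores, e.g. leave only cpu0 running under light load."""
    for cpu in cores:
        set_core_online(cpu, False)

# park_cores()               # take cpus 1-3 offline
# set_core_online(1, True)   # bring cpu1 back when load rises
```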
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
It is the old chicken-and-egg problem. Could a better processor be built if you dropped all the legacy support? Yes. Now who is going to alienate all their former customers to make that new CPU?
The same goes for the OS. Could MS make a better OS if they got rid of legacy support? Yes.
I would love to see something like ARM become more mainstream. I have an ARM development board sitting here on my desk. I just don't see it or anything else as a threat to x86, simply because of the software support.

For power efficiency the embedded world kills anything x86. I can set this ARM board to draw current only in the picoamp range when idle.
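Just as an illustration of how little code that kind of idle configuration can take, here is a hypothetical MicroPython-style sketch. The board, toolchain, and wake interval are all assumptions, since the post doesn't name them:

```python
# Hypothetical illustration only: the poster doesn't say what board or SDK
# he uses. On a small ARM board running MicroPython, dropping into a deep
# sleep state between events might look like this.
import machine

def idle_until_next_sample(ms: int = 60_000) -> None:
    """Enter deep sleep for `ms` milliseconds; on most ports the board
    resets and re-runs main.py when the wake timer fires."""
    machine.deepsleep(ms)

# take_reading()
# idle_until_next_sample()   # current draw falls to the very low idle level
```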

 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: ShawnD1
It's not the manufacturing process or the architecture that causes it to use so much power. It's simple physics. An Intel Q9550 has over 800 million transistors and they all leak current. You can lower the frequency and you can lower the voltage, but you can't get away from 800 million transistors. An Intel Atom only uses about 5W of power, but it only has about 50 million transistors, 1/16 as many. If scaling were perfectly linear, you would expect a quad-core Intel to use 16 x 5 = 80W at ~1.6GHz (the frequency of an Atom). With a bit of underclocking and undervolting, you should be happy if that number gets below 50W, even when doing absolutely nothing.

If you want your computer to use less power, don't buy so many transistors. Complaining about the idle power consumption of 800 million transistors is as silly as those people who wonder why the 4-cylinder and 6-cylinder models of the exact same car use different amounts of gasoline, or people who buy a 1000W PSU and wonder why it has horrendously bad power efficiency when their computer only uses 200W. Golden rule of life: things have higher efficiency when they are used close to their rated value. Running something at 10% of its maximum output, regardless of what it is, will never be efficient.

If you want to know why some picture viewer or my Texas Instruments calculator can last a year on 4 AA batteries, it's because a Zilog Z80 processor only has 8500 transistors.

That IS the problem. Architecture dictates minimum complexity for a workable CPU.
This guy said that x86 is full of junk. Junk means transistors, transistors mean power, power is bad in mobile/handheld situations.

If you want an x86-compatible CPU, you are going to need transistors dedicated to things which don't really matter and won't be that useful, but you need the compatibility. ARM etc. doesn't have these requirements, so ARM chips carry fewer junk transistors that may hardly ever, if ever, be used, which cuts the transistor count and results in a lower-power chip.

The article is saying: "Why use an architecture which is inherently going to be more complex and thus require more power, when we already have something in place which is pretty much designed for this area, rather than shoehorning in an inefficient instruction set? Intel is trying to gain dominance with a product which is inherently inferior."
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: Lonyo

If you want an x86-compatible CPU, you are going to need transistors dedicated to things which don't really matter and won't be that useful, but you need the compatibility. ARM etc. doesn't have these requirements, so ARM chips carry fewer junk transistors that may hardly ever, if ever, be used, which cuts the transistor count and results in a lower-power chip.


The problem is that you cannot compare ARM and x86 in that way. x86 is designed to run applications that already exist. ARM is designed so people can pick the application they want to run, then pick the processor to run it on. If ARM targeted being compatible with existing software applications then it would be in the same position as Intel.

What I think Intel needs to do, though, is trim the fat: cut out some of the older instructions and registers that haven't been used by software in more than 5 years and let that part of the compatibility end.
 

TuxDave

Lifer
Oct 8, 2002
10,571
3
71
Originally posted by: Modelworks

The problem is that you cannot compare ARM and x86 in that way. x86 is designed to run applications that already exist. ARM is designed so people can pick the application they want to run, then pick the processor to run it on. If ARM targeted being compatible with existing software applications then it would be in the same position as Intel.

What I think Intel needs to do, though, is trim the fat: cut out some of the older instructions and registers that haven't been used by software in more than 5 years and let that part of the compatibility end.

You can trim the fat without losing compatibility, which is still being done. Instead of having dedicated hardware for some old instruction, you turn it into a slow multi-step flow that gets you the result you want, just 10x slower, by funnelling it through other hardware.
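As a toy illustration of that kind of multi-step flow, here is a Python sketch that emulates x86's legacy AAA (ASCII Adjust after Addition) instruction with nothing but simple adds, masks, and compares. AAA is a real legacy instruction, but the decomposition below is only a sketch of the idea, not actual microcode:

```python
# Toy illustration of "microcoding" a rarely used legacy instruction instead
# of spending dedicated hardware on it. The steps below follow the documented
# AAA behaviour but are purely illustrative.

def aaa(ax: int, af: bool):
    """Emulate AAA using only simple add/mask/compare steps.
    Returns (new_ax, new_af, new_cf)."""
    al = ax & 0xFF
    if (al & 0x0F) > 9 or af:
        ax = (ax + 0x106) & 0xFFFF    # step 1: fixed-up add into AH:AL
        af = cf = True                # step 2: set adjust/carry flags
    else:
        af = cf = False
    ax = (ax & 0xFF00) | (ax & 0x0F)  # step 3: keep only the low BCD digit in AL
    return ax, af, cf

print(aaa(0x000F, False))  # AX becomes 0x0105 (AH=1, AL=5), AF=CF=True
```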

Losing compatibility is something I think customers would be pissed off about. They probably wouldn't put up with Intel regularly releasing a list of instructions that just don't work anymore and making them recompile everything they have (and used to have).
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: TuxDave

You can trim the fat without losing compatibility, which is still being done. Instead of having dedicated hardware for some old instruction, you turn it into a slow multi-step flow that gets you the result you want, just 10x slower, by funnelling it through other hardware.

Losing compatibility is something I think customers would be pissed off about. They probably wouldn't put up with Intel regularly releasing a list of instructions that just don't work anymore and making them recompile everything they have (and used to have).

I agree.

Intel could take the stance that the CPUs needed to run the older applications would be available on the used/refurb market, or that those applications could be emulated on a modern PC. Applications that are 5 years behind the current CPU can be emulated pretty well.

The other issue, though, would be getting programmers to go along. I saw some assembly language code a few days ago that had an instruction I didn't recognize and thought it was a typo. I had to dig out a reference for the 286 CPU to find it. The program had been written this year, but the programmer had been programming since before x86.

 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: Modelworks
For power efficiency the embedded world kills anything x86. I can set this ARM board to only draw current in the pico amps range when idle.

But what is your definition of idle? In the world of desktop computers, processors are never idle unless the computer is in S3. If you're using Windows Vista, go into Resource Monitor and look at the disk activity. Even if you're not telling the computer to do anything, it's still shuffling files around, screwing with the memory, updating the clock, indexing files, defragging files, running malware in the background, MSN checking if I'm still connected to the internet, the UPS software getting updates from the UPS, the screen updating every time the cursor blinks in this text box, and other random tasks.

Running a complicated system like Windows takes a ridiculous amount of CPU power even if it appears to be doing nothing. Windows taking over 500 MB of RAM before you even log in should give some indication of how many things are going on.
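A quick way to see that an "idle" desktop is never really idle is to just sample CPU utilisation for a while. The sketch below assumes the third-party psutil package, which isn't mentioned anywhere in the thread:

```python
# Small sketch of the "idle isn't really idle" point: sample CPU utilisation
# for a minute while doing nothing and notice it never sits at 0%.
# Uses the third-party psutil package (an assumption, not from the thread).
import psutil

samples = []
for _ in range(60):
    samples.append(psutil.cpu_percent(interval=1))  # blocks ~1 s per sample

print(f"min {min(samples):.1f}%  max {max(samples):.1f}%  "
      f"avg {sum(samples)/len(samples):.1f}% over {len(samples)} s of 'idle'")
```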

If you want a x86 compatible CPU, you are going to need transistors dedicated to things which don't really matter and won't be that useful, but you need to compatibility. ARM etc doesn't have these requirements, so they have less junk transistors that may hardly ever, if not never, be used, thus reducing the number of transistors, removing the junk transistors, and resulting in a lower power chip.
(remember that I was responding to a guy complaining about x86 using too much power on his desktop computer)
x86's junk transistors are for legacy support. If an ARM processor were to take over the desktop market (similar to Apple switching from PPC to x86), legacy software would need to be run through emulation, which is just as bad as having junk transistors. Remember how slow things are on a Mac when you run them through Rosetta? That's exactly what would happen when using ARM to emulate x86. Something that would take 20% CPU on a so-called inefficient design suddenly takes 100% CPU on an efficient design. Emulation is not efficient. Even when it's the same architecture, you can expect to see a 50% performance hit when running a program in VMware. The performance hit is much, much worse when trying to emulate a different architecture.
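To see where that emulation overhead comes from, here is a toy fetch-decode-execute loop for a made-up three-instruction machine: every guest instruction costs a fetch, a decode, and a dispatch on the host, which is the gap translators like Rosetta shrink but never remove. The instruction set and program here are invented purely for illustration:

```python
# Toy interpreter for a made-up 3-instruction machine, to illustrate why
# pure emulation is slow: every guest instruction costs a fetch, a decode,
# and a host-level dispatch. Real emulators use dynamic translation to cut
# this overhead, but it never disappears entirely.

def run(program, regs=None):
    regs = regs or {"a": 0, "b": 0}
    pc = 0
    while pc < len(program):
        op, *args = program[pc]          # fetch + decode
        if op == "load":                 # dispatch
            regs[args[0]] = args[1]
        elif op == "add":
            regs[args[0]] += regs[args[1]]
        elif op == "jnz":                # jump if register is non-zero
            if regs[args[0]] != 0:
                pc = args[1]
                continue
        pc += 1
    return regs

# Count down from 3: each guest "add"/"jnz" costs dozens of host operations.
prog = [("load", "a", 3), ("load", "b", -1), ("add", "a", "b"), ("jnz", "a", 2)]
print(run(prog))   # {'a': 0, 'b': -1}
```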

If Intel starts getting into embedded devices with x86, you can expect to see a lot of junk transistors eliminated. It probably wouldn't even be correct to call it x86 at that point.
 

Forumpanda

Member
Apr 8, 2009
181
0
0
I think the end goal is to drop unused/deprecated instructions and simply emulate any applications still using them.

I guess the real problem is how to do it smoothly from a user perspective. If old applications break unless you run them in a special 'emulation mode' from the OS, or if you have to scan the code for old instructions before executing it, then it quickly becomes a worse situation than simply including an ever-shrinking (in die size) part of the CPU for running those instructions.

However, for a specific market segment (netbooks/handhelds), I could see making a line of processors that doesn't have to support every legal instruction.
 

senseamp

Lifer
Feb 5, 2006
35,787
6,197
126
After dealing with the pathetic battery life, noise, and heat of my x86 notebook, I just don't see myself buying another one.
If I get 2 hrs on an x86 device and 6 hrs on an ARM one, then as long as ARM can render websites, let me IM and check email, play videos and music, and do it all without a fan, I'll go with that for my personal use.
For a work machine, the company still gets me an x86 notebook, but honestly, if a device can run a VNC viewer, a web browser, and an email reader, I don't really care what architecture it uses either.
For me, in terms of priorities for a mobile computer, battery life is number one, followed by weight, followed by performance.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: senseamp
If I get 2 hrs on an x86 device and 6 hrs on an ARM one

The battery life has almost nothing to do with the processor.
http://www.hexus.net/content/item.php?item=12223
The Silverthorne CPU - the beating heart of the Centrino Atom - consumes a maximum of 2.4W at a 1.8GHz clock-speed but still retains the Core 2-derived Intel Merom's instruction set.
.....
The 990g sub-notebook packs in a hyperthreaded Silverthorne 1.6GHz processor that runs off a 533MHz front-side bus - note the two cores in Device Manager. Hooking up to the Poulsbo chipset and outputting video by the GMA 500 IGP, the entire unit consumes around 15W.

The CPU only consumes 16% of the power. The other 84% is taken by the monitor, hard drive, RAM, and video.
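That 16% figure falls straight out of the quoted Hexus numbers (about 2.4W maximum for the CPU versus roughly 15W for the whole unit):

```python
# Reproducing the 16% figure from the quoted Hexus numbers:
# ~2.4 W CPU (Silverthorne max) out of ~15 W for the whole sub-notebook.
cpu_w = 2.4
system_w = 15.0

cpu_share = cpu_w / system_w
print(f"CPU share of system power: {cpu_share:.0%}")                         # ~16%
print(f"Everything else (screen, disk, RAM, chipset/IGP): {1 - cpu_share:.0%}")
```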