
Intel charged with monopolistic practices: on account of both MPU and graphics


Hard Ball

Senior member
Jul 3, 2005
594
0
0
The x86 part was exactly my line of thinking too when I read the FTC's and Intel's statements.

This is exactly where I see this headed: either forcing Intel to make its x86 license more readily available and licensable, or busting them up.
I think that's something that many parties in the industry have been waiting for; an open x86 landscape would be one of the few ways to guarantee long-term competition in the client computing industry. One alternative is for the government to eventually regulate the production and pricing of x86 products, the way the Fed regulated AT&T from 1956 until 1984.

The other alternative would be, as you said, to bust Intel up into components, with the majority of general-purpose microarchitecture design and manufacturing possibly remaining in a single entity, while vertically integrated businesses like chipsets, graphics, software, flash memory, communications, etc. could be parceled out to other players in the industry like NV, IBM, or Broadcom. It's also possible that one of the design facilities would go to AMD. This is a very far-fetched scenario and would only occur when all else has failed. But it did happen once in relatively recent US history, so it is not completely out of the realm of possibility.

In terms of busting them up, there are some sizable entities that could become stand-alone businesses: the graphics division, x86, Itanium (no reason to keep it with the x86 business entity), Intel Capital, the compiler division, and the chipset division.

Forcing whoever ends up with the fabs to continue to fab products for the fabless business entities would not be an issue either; precedent exists from when DEC went under and Compaq bought DEC: the DOJ required both Intel and Samsung to fab the Alpha processor for a few years after the deal was complete.
Yes, I see that if Intel is ever broken up in that manner, the most likely division to be parceled out to NV would be the chipset division; this would guarantee for some time that the new Intel would not again have control over its own platform, hence no more pricing tactics that would be deemed improper.
 

Hard Ball

Senior member
Jul 3, 2005
594
0
0
Let's hear from the leader himself:

Jen-Hsun Huang said:
The U.S. government announced today that it has filed an antitrust lawsuit against Intel. This is an action the industry needs and one that consumers deserve. And it’s one that can completely transform the computer industry.

The facts are clear. The FTC has charged that Intel has used its monopoly illegally to stifle innovation, to keep prices for their products inflated, and to unfairly block competitors. The FTC believes that millions of consumers have paid more and received less quality in return–and that companies and their employees have been forced out of markets where Intel has been threatened.

Intel is fully aware that great graphics have become one of the most important features for consumer PCs, the fastest-growing segment of the PC market. Even more alarming to Intel is the revolutionary parallel computing technology in our GPUs that is being adopted by software developers across the world. The more successful we became, the bigger threat we were to Intel’s monopoly. Instead of creating competitive GPU solutions and competing on the merits of their products, Intel has resorted to unlawful acts to stop us. The FTC announced today that this isn’t acceptable.

Nothing this complicated gets decided quickly. It will take months for the FTC case to be heard by an administrative judge who will then recommend a ruling back to the FTC. And it’s possible that this decision could be appealed. But today is a huge step forward for all of us that will begin to re-level the playing field.

Today’s FTC announcement highlights the industry-changing impact of the GPU and the importance of our work. Our innovation is making the PC magical and amazing again. I can now imagine the day when Intel can no longer block consumers from enjoying our creation and experience computing in a way we know is possible.
http://www.nordichardware.com/news,10438.html

So we are getting a pretty good picture of what NV would like to happen with this case, from an excellent source. At the minimum that would involve no more bundling of MPUs and chipsets; providing non-MCM, non-integrated-die options for future microarchitectures to allow third-party graphics; and opening up interconnect standards like QPI and DMI.

It remains to be seen whether NV would actually ask for an x86 license. It's anyone's guess.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Yes, I see that if Intel is ever broken up in that manner, the most likely division to be parceled out to NV would be the chipset division; this would guarantee for some time that the new Intel would not again have control over its own platform, hence no more pricing tactics that would be deemed improper.
So you'd give a chipset monopoly to nvidia on the Intel platform? I guess that'd force them to buddy up and all, but it sounds like a bad idea.

BTW, it's pretty funny to hear nvidia's comments on this, when they're probably guilty of a few of the same tactics as Intel (but nvidia isn't in a monopoly position).


Oh, and the software industry could also diverge from x86. Microsoft could switch completely over to managed code on different virtual machines, so that the same code base runs on more than one architecture via an intermediary language. Older software could be shoe-horned into the virtual machine (maybe not possible without source code, short of an x86 emulator, but Apple did it, so why not). Could full-on virtualization (of the VMware variety) be used to run x86 on another architecture?
 

Idontcare

Elite Member
Oct 10, 1999
21,118
57
81
It remains to be seen whether NV would actually ask for an x86 license. It's anyone's guess.
Well, you are no stranger to the challenges of the design business: do you think it is at all viable for a company with Nvidia's resources and talent base to do much of anything with an x86 license, in the face of the deep pools of experience, talent, and IP portfolios that both AMD and Intel are putting to work on the designs they plan to sell in 2013?

I don't think the barrier to entry there is the license battle... that is a fast-moving train, and Nvidia would be trying to catch up to it some 30 years after it left the station. They'd stand a better chance chasing after everything that uses ARM than diving into x86, IMO.
 

akugami

Diamond Member
Feb 14, 2005
4,633
69
91
The x86 license is not transferable through the sale of AMD to another entity, AFAIK.
I'm not sure what exactly is in the new Intel/AMD licensing agreement. Maybe there is some provision in there that allows the AMD license to transfer to any new owner? I dunno, just random thoughts here. Maybe someone more in the know can comment.

An x86 license from Intel isn't necessary to make an x86 CPU. It's just that both AMD and Intel have added so much to the x86 architecture that making a modern x86 CPU would require licenses from not just Intel but AMD as well.

Oh, and the software industry could also diverge from x86. Microsoft could switch completely over to managed code on different virtual machines, so that the same code base runs on more than one architecture via an intermediary language. Older software could be shoe-horned into the virtual machine (maybe not possible without source code, short of an x86 emulator, but Apple did it, so why not). Could full-on virtualization (of the VMware variety) be used to run x86 on another architecture?
I think there already is a likely candidate challenging x86 dominance, and that is ARM processors. A lot of modern smartphones and other devices use ARM processors: the Palm Pre, Apple iPhone, Nintendo DS, and Tegra-based devices all use ARM CPUs. ARM CPUs will get more powerful, and the truth is they are probably already at a stage where they're good enough for the majority of what people use computers for: basically web browsing, email, word processing, and viewing videos. And ARM processors will only get better and, hopefully, more power efficient.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
I don't think the barrier to entry there is the license battle... that is a fast-moving train, and Nvidia would be trying to catch up to it some 30 years after it left the station. They'd stand a better chance chasing after everything that uses ARM than diving into x86, IMO.
They were able to make a chipset pretty quickly, and they came out with Tegra without taking 30 years.

It's not like they would need to start from scratch.
http://www.tomshardware.com/news/nvidia-transmeta-x86-gpu-cpu,8997.html
 

Idontcare

Elite Member
Oct 10, 1999
21,118
57
81
So you think if they were handed an x86 license today they'd have a product capable of competing with Haswell in 2012-2013? I don't share that confidence but perhaps I am just overly pessimistic for unjustified reasons.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
So you think if they were handed an x86 license today they'd have a product capable of competing with Haswell in 2012-2013? I don't share that confidence but perhaps I am just overly pessimistic for unjustified reasons.
They might be able to compete with Atom by that point.

But if Intel with far more resources can't catch up to nvidia in graphics in a few years time, I doubt nvidia is going to catch up to Intel on their turf in under a decade. ARM is different, the designs aren't as complex, and can't ARM core designs be licensed directly?
 

Idontcare

Elite Member
Oct 10, 1999
21,118
57
81
They might be able to compete with Atom by that point.
Compete with the Atom of today, or the Atom Intel will have out at that point in time?

And are you talking just performance, or performance/watt?
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Compete with the Atom of today, or the Atom Intel will have out at that point in time?

And are you talking just performance, or performance/watt?
Err, both? Atom doesn't have great performance or great performance/watt; it's just a low-wattage x86 CPU that performs decently. I doubt its standing will change much as time goes on: Intel's ULV chips will still be the kings of performance/watt, and plenty of other mobile chips will best Atom in performance/watt.
For Atom, the relevant characteristic seems to be more performance/cost, with low power usage just coming as part of that. I don't think Nvidia would be able to beat a future Atom on performance/cost.
 

Idontcare

Elite Member
Oct 10, 1999
21,118
57
81
I only asked because it is not exactly clear to me where fabless Nvidia would gain access to the kind of Intel-caliber process technology that is needed to make the power consumption what it is, and that gap is only going to grow larger over time.

I think AMD, with all their experience in making x86-based products, would be doing well if Bobcat competes with Atom in both performance and performance/watt.

I don't hold that confidence for Nvidia. I guess after working with Sun as much as we did at TI, and seeing firsthand just how complex the task of designing microprocessors is, I don't share the confidence that you and others here have that Nvidia just needs a license and a few years at the drawing board for magic to happen, with no real good reasoning behind it (wishful hoping at work, maybe?). (We also happened to design and fab our own x86 processors at TI, back in the 1995 timeframe, and the challenges left quite an impression on me.)

Cyrix and VIA never really challenged AMD or Intel for market share, and Transmeta came and went even more quickly.

Competing with ARM products is reasonable and feasible because everyone in that competitive circle relies on fabs with roughly equivalent process technology and node release timelines, so design makes all the difference, and design is the only place fabless Nvidia can differentiate themselves.
 

Hard Ball

Senior member
Jul 3, 2005
594
0
0
So you'd give a chipset monopoly to nvidia on the Intel platform? I guess that'd force them to buddy up and all, but it sounds like a bad idea.

BTW, it's pretty funny to hear nvidia's comments on this, when they're probably guilty of a few of the same tactics as Intel (but nvidia isn't in a monopoly position).


Oh, and the software industry could also diverge from x86. Microsoft could switch completely over to managed code on different virtual machines, so that the same code base runs on more than one architecture via an intermediary language. Older software could be shoe-horned into the virtual machine (maybe not possible without source code, short of an x86 emulator, but Apple did it, so why not). Could full-on virtualization (of the VMware variety) be used to run x86 on another architecture?
It's possible, although I'm not sure what exactly nVidia would be guilty of, except for some lame lock-in of PhysX to their own graphics drivers. I'm sure the FTC would not be interested unless it affects the market in some significant way (which PhysX definitely does not, at this point in time).

I'm not sure what you mean by intermediate language. Perhaps you mean some intermediate representation used in compilers (like three-address code, or the stack-oriented, pushdown-automaton-like representation in the JVM), primarily used during multi-pass optimizations. That is not exactly ideal, since it is neither highly optimized nor guaranteed to be compatible without some software trap to handle exceptional cases.

A VM architecture in the OS would work. But I guess the question then becomes how much performance, as well as (perhaps more importantly) power efficiency, you would be willing to sacrifice; this is especially important in an age of ever slimmer and lighter client platforms, and most of the time it would not be advisable.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
It's possible, although I'm not sure what exactly nVidia would be guilty of, except for some lame lock-in of PhysX to their own graphics drivers. I'm sure the FTC would not be interested unless it affects the market in some significant way (which PhysX definitely does not, at this point in time).

I'm not sure what you mean by intermediate language. Perhaps you mean some intermediate representation used in compilers (like three-address code, or the stack-oriented, pushdown-automaton-like representation in the JVM), primarily used during multi-pass optimizations. That is not exactly ideal, since it is neither highly optimized nor guaranteed to be compatible without some software trap to handle exceptional cases.

A VM architecture in the OS would work. But I guess the question then becomes how much performance, as well as (perhaps more importantly) power efficiency, you would be willing to sacrifice; this is especially important in an age of ever slimmer and lighter client platforms, and most of the time it would not be advisable.
By intermediate language, I mean something like Java or .NET byte code: an architecture-independent compiled machine code that is then further compiled by a VM.
AFAIK, there are already components of Windows using .NET byte code, and there are examples of production operating systems (on embedded devices) running in Java virtual machines.
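As a toy illustration of that idea (not how the JVM or CLR is actually implemented, and all opcode names here are made up): the same architecture-neutral bytecode runs unchanged on any host CPU that has an interpreter for it.

```python
def run(bytecode):
    """Interpret a list of (op, arg) pairs on a tiny stack machine."""
    stack = []
    for op, arg in bytecode:
        if op == "PUSH":        # push a constant onto the stack
            stack.append(arg)
        elif op == "ADD":       # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":       # pop two values, push their product
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError("unknown opcode: " + op)
    return stack.pop()

# (2 + 3) * 4 -- this same instruction list would run unchanged whether
# the interpreter itself is compiled for x86, ARM, or anything else.
program = [("PUSH", 2), ("PUSH", 3), ("ADD", None),
           ("PUSH", 4), ("MUL", None)]
print(run(program))  # 20
```

A real VM recovers the performance cost by JIT-compiling hot bytecode down to native machine code instead of interpreting it, which is where the power/performance trade-off discussed above comes in.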
 

Hard Ball

Senior member
Jul 3, 2005
594
0
0
Well, you are no stranger to the challenges of the design business: do you think it is at all viable for a company with Nvidia's resources and talent base to do much of anything with an x86 license, in the face of the deep pools of experience, talent, and IP portfolios that both AMD and Intel are putting to work on the designs they plan to sell in 2013?

I don't think the barrier to entry there is the license battle... that is a fast-moving train, and Nvidia would be trying to catch up to it some 30 years after it left the station. They'd stand a better chance chasing after everything that uses ARM than diving into x86, IMO.
Yeah, if NV were to start from only the resources they have now, they would have considerable difficulty getting out any semi-competitive product at all in the next five years, even if they had access to all of the patents cross-licensed between Intel and AMD today. That's not to say NV does not have a lot of very skilled engineers, but those engineers would not necessarily have the experience needed to make designing a general-purpose architecture a smooth process.

Designing something competitive in a reasonable amount of time takes a lot of prerequisites; and only when a party designs in a reasonable amount of time are the design elements not obsolete (relative to the rest of the competitive industry) by the time it gets to market. Most semiconductor companies can probably whip up some x86 design with relative ease, provided that the design is in-order and scalar; it's usually being able to put a superscalar, out-of-order pipeline together with a competitively low upper bound on cycle time that separates the men from the boys.

It takes personnel and institutional knowledge: having mature libraries of logic blocks; a detailed understanding of the parameters and ramifications of employing more esoteric elements like domino logic and 8T cells; the hardware and software infrastructure for large-scale architectural simulations; highly optimized algorithms (ones that don't run into NP- or NSPACE-complete blowups) for things like graph partitioning or optimal metal-layer routing; and countless internal research projects that meticulously define the potential benefits and detriments of, and interactions among, many architectural elements.
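To put a toy example behind the graph-partitioning point: placement tools need to split a netlist into balanced halves while cutting as few wires as possible. A crude greedy bisection, loosely in the spirit of Kernighan-Lin (production EDA tools use far more refined heuristics, and the example netlist here is invented), might look like:

```python
def cut_size(edges, part_a):
    """Count the edges crossing between part_a and the rest."""
    return sum(1 for u, v in edges if (u in part_a) != (v in part_a))

def greedy_bisect(nodes, edges):
    """Balanced bisection: start from an arbitrary even split, then
    greedily swap one node from each side while the cut shrinks."""
    part_a = set(nodes[:len(nodes) // 2])
    improved = True
    while improved:
        improved = False
        for u in sorted(part_a):
            for v in sorted(set(nodes) - part_a):
                trial = (part_a - {u}) | {v}   # swap u out, v in
                if cut_size(edges, trial) < cut_size(edges, part_a):
                    part_a = trial
                    improved = True
                    break                      # rescan from the new split
            if improved:
                break
    return part_a

# Six cells forming two natural clusters {0, 1, 2} and {3, 4, 5},
# joined by a single wire (2, 3).
nodes = [0, 3, 1, 4, 2, 5]
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
print(sorted(greedy_bisect(nodes, edges)))  # [0, 1, 2] -- cuts one wire
```

Exact balanced bisection is NP-hard, which is precisely why the mature, heavily tuned heuristic infrastructure described above is such a barrier to entry.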

But most of all, it's about having the key people: usually senior engineers with a decade or more of experience in every one of the tens of steps in the design process, from high-level architecture, to logic design, to physical layout and process tech. That is what lets you attain efficiency at each step, discard the ideas you know will not work, and restrict detailed simulation and investigation to only the potentially fruitful configurations of a large parameter set. Experience really counts and will save you a lot of time at each stage; without it the design probably would never come to market in time to be competitive or to make a profit.

NV's most likely option would be to acquire some external expertise somehow, and perhaps get at least an entire team on board. DEC is no longer around, so they would have to look at players like VIA to see if there's any suitable foundation to build on; and, of course, to see whether a merger with AMD would be financially feasible.
 

Hard Ball

Senior member
Jul 3, 2005
594
0
0
By intermediate language, I mean something like Java or .NET byte code: an architecture-independent compiled machine code that is then further compiled by a VM.
AFAIK, there are already components of Windows using .NET byte code, and there are examples of production operating systems (on embedded devices) running in Java virtual machines.
OK, I know what you mean now; which is not much different than what I discussed. My previous comments would still apply.

Edit:
Although if there is hardware-level I/O virtualization, then the VM scheme might have some merit; that would largely avoid most of the handicaps such a system would otherwise have. I guess we will have to see how soon that comes to fruition.
 

Schmide

Diamond Member
Mar 7, 2002
5,361
282
126
Nice post Hard Ball.

To me it looks likely that sooner or later Intel will buy Imagination Technologies, as they've been slowly acquiring shares. Too bad Apple and a couple of others have been scooping up shares, either to block the deal or to make a few bucks off it. I wonder if they're holding back because of this impending litigation.
 

at80eighty

Senior member
Jun 28, 2004
458
2
81
Hard Ball - as a guy with just an interest in the tech field, I'd like to say thank you very much for distilling a clearly complex process into an understandable format. The business aspects of the field fascinate me more than the microdetails of the new advancements, and you made them very digestible.
 

her209

No Lifer
Oct 11, 2000
56,352
9
0
I guess that rules out Intel buying nVidia anytime soon. Also, does AMD get double for CPU and graphics (when ATI was still its own entity)?
 

Dribble

Golden Member
Aug 9, 2005
1,923
456
136
I'd be very surprised if Nvidia really cared about making x86 CPUs - chipsets, yes, but CPUs - why bother?

In many ways x86 is inefficient and outdated, so Intel needs to keep everyone on board, or people might realize there are other CPU architectures out there; Intel's greatest threat is ARM, not Nvidia. The irony might be that in blocking Nvidia on x86, they push them towards ARM. Nvidia then produces great products for ARM, which actually kills x86 market share and costs Intel more money (e.g. if you kill Ion, you might find a lot fewer Atom sales, and a lot more Tegra 2-based ARM sales).
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
If Nvidia expects the GPU to handle everything, save running an OS, then I can see them developing a CPU similar to ARM, or maybe even Atom. Or even something simpler.
It would be great to have a third option for a platform. Intel, AMD, or Nvidia.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
If Nvidia expects the GPU to handle everything, save running an OS, then I can see them developing a CPU similar to ARM, or maybe even Atom. Or even something simpler.
It would be great to have a third option for a platform. Intel, AMD, or Nvidia.
Well, without x86, their options are limited to Linux, Windows Mobile, and OSX.

I know there are netbooks coming out running Windows Mobile, but that OS is still an abomination.
I could see Apple possibly doing a purpose built device (like the rumored iTablet) using Tegra.
The new DS is rumored to use Tegra, and I imagine they'll get a lot of design wins in graphics heavy mobile devices.
Linux would be great, but consumers haven't accepted it, and Nvidia has little interest in it. Maybe a more hardware-specific distro (like Chrome OS) could be a possibility, but I think Nvidia's lack of open-source drivers would cause device manufacturers to shy away. Then again, are there any open-source 3D graphics drivers for ARM? The only one I know of is Intel's for x86.
 

Genx87

Lifer
Apr 8, 2002
41,061
494
126
Really depends on whether Microsoft can be persuaded to build a version of their OS targeted at another processor. They have done it in the past for Itanium and the Alpha.

For now I think Nvidia will stay in the realm of handhelds. Which isn't a bad place to be, as more and more people are using them for everything they need.
 

zsdersw

Lifer
Oct 29, 2003
10,560
2
0
On a philosophical level I find it wholly unjust to force a company to alter how it chooses to license its products; it interferes with the right of free association, the right to enter into business agreements on one's own terms. Why, exactly, does Intel *have* to... with the full force and power of government... license anything to Nvidia, or to anyone else, for that matter? And don't answer that question with laws... remember, I'm talking about this philosophically. It's about what should be, not what already is.

Consumers should be the ones punishing Intel, not the government. Consumers hold all the power in a capitalist society; they vote with their wallets. Consumer apathy, idiocy, or ignorance is not the government's responsibility (or philosophically justified role) to mitigate.
 
