Intel files lawsuit against Nvidia


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: SickBeast
I wonder if we'll ever see the day where we can just plug a graphics card into a motherboard and boot up
doubtful in a PC

it is a marriage

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: SickBeast
Originally posted by: apoppin
doubtful

it is a marriage


Unless they make something like the Fusion.

The CPU is inherently better suited for some tasks; other tasks are far better suited to the GPU.

Nvidia wants to be pre-eminent in the relationship inside the PC, and Intel doesn't much care for a future shift of relevance away from the CPU.

see, Intel wants "intel inside" everything that can use a CPU .. Jensen has a vision of "Nvidia inside" everything that has potential to use a GPU.

Nvidia has cornered the market on REAL TIME imaging for medical purposes .. and it is just the beginning, as CUDA is now taught in 50 universities; it is the equivalent of x86 for the GPU [well, more or less - they want it to be universal]

AMD's Fusion looks to be a step up from integrated graphics
Intel's larrabeast looks to be a CPU emulating a GPU
:p
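
if you have never seen "C for the GPU", here is a toy CUDA kernel sketch .. a made-up example, just to show the flavor: you write one small function and the hardware runs it across thousands of threads at once

// Toy CUDA kernel: y = a*x + y, one GPU thread per element.
// Names and sizes are invented for illustration.
#include <stdlib.h>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // this thread's element
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main(void)
{
    const int n = 1 << 20;                     // a million floats
    size_t bytes = n * sizeof(float);
    float *hx = (float *)malloc(bytes), *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; i++) { hx[i] = 1.0f; hy[i] = 2.0f; }

    float *dx, *dy;                            // copies in GPU memory
    cudaMalloc((void **)&dx, bytes);
    cudaMalloc((void **)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);   // 4096 blocks x 256 threads
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);  // every hy[i] is now 4.0f

    cudaFree(dx); cudaFree(dy); free(hx); free(hy);
    return 0;
}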


 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: DrMrLordX
Originally posted by: SickBeast
x8
That's some pretty great info, thanks! :)

I always thought that x86 was a basic instruction set. Shouldn't it have been created way more than 17 years ago? If you're right about this, it should give NV the right to create an x86 processor, provided they engineer it from the ground up and don't infringe on any of the more modern patents.

x86 is much older than 17 years, yes. Extensions to the instruction set came with nearly every new processor generation over the years, including some baby-step evolutionary upgrades between generations. And glad to be of assistance.
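
You can actually watch that layering from software, by the way: each generation advertises its new extensions through the CPUID instruction. A little C sketch of how a program checks for them (uses GCC's cpuid.h; bit positions are from Intel's CPUID docs, and this is just an illustration):

/* Report a few x86 extensions via CPUID leaf 1. */
#include <stdio.h>
#include <cpuid.h>

int main(void)
{
    unsigned eax, ebx, ecx, edx;

    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        puts("CPUID leaf 1 not supported");
        return 1;
    }
    printf("MMX:  %s\n", (edx & (1u << 23)) ? "yes" : "no");
    printf("SSE:  %s\n", (edx & (1u << 25)) ? "yes" : "no");
    printf("SSE2: %s\n", (edx & (1u << 26)) ? "yes" : "no");
    printf("SSE3: %s\n", (ecx & (1u << 0)) ? "yes" : "no");
    return 0;
}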

Originally posted by: aka1nas
Yeah, but what do you actually get out of that? A basic 8-bit (or even 16-bit) x86 processor can't run modern software. They would really need an x86-64 license (among others) to make something usable.

It gets you a 32-bit processor (the 386, 486, and Pentium were all 32-bit) that could boot Windows 7, at least in theory if not in practice. It might be slow as balls, but you get the idea. If the patents behind the P6 core are now in the public domain, Nvidia might be able to reverse-engineer a Klamath-core P2 or something (or at least a Pentium Pro). I doubt they would have any legal answer to x86-64 or EM64T, but it sure would be interesting to see whether AMD would license those extensions to Nvidia. My guess would be no, but you never know.

Anyway, they might be able to do something with that, but it would take a lot of resources they don't necessarily have, and whatever they created would probably wind up looking a lot like a Cell processor: they'd have to use multiple vector processors to make up for the lack of native SSE/SSE2/SSE3/SSE4 support, and they'd have to translate instructions in hardware to feed SIMD work to those vector processors (and I don't even know if they could do that legally).

SSE2 support is a required part of x86-64, and many apps rely on its presence even on normal x86.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: apoppin
see, Intel wants "intel inside" everything that can use a CPU .. Jensen has a vision of "Nvidia inside" everything that has potential to use a GPU.

With Intel's access to leading-edge process technology seconded only by AMD's, Nvidia stands to be third in this foot-race if they continue to rely on TSMC to produce their chips as we march towards 16nm and beyond.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: alkalinetaupehat
Ya know, hot wings are so much better than popcorn for stuff like this. You can throw the bones at the loser!

True that, and unlike with popcorn, beer goes soooo much better with hot wings :beer:
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: taltamir
AMD's Fusion looks to be a step over IG
AMD fusion looks like it will arrive YEARS after the intel fusion.

This.

There are only so many CPU cores the average user can take advantage of. Even 4 cores is overkill, IMO.

Intel and AMD will have so much die space to dedicate to the graphics cores in these "fusion" chips that they should end up with at least a decent midrange GPU, IMO.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Idontcare
Originally posted by: apoppin
see, Intel wants "intel inside" everything that can use a CPU .. Jensen has a vision of "Nvidia inside" everything that has potential to use a GPU.

With Intel's access to leading-edge process technology seconded only by AMD's, Nvidia stands to be third in this foot-race if they continue to rely on TSMC to produce their chips as we march towards 16nm and beyond.

they do have access to other foundries .. if that is *all* you are concerned about



Originally posted by: taltamir
AMD fusion looks like it will arrive YEARS after the intel fusion.
How do you know?
:confused:

AMD blindsided Nvidia with the 4850 and 4870 series, and they are keeping Fusion under wraps; who knows whether they've had a breakthrough. I'd much rather bet that ATi's GPU engineers can teach AMD's CPU engineers how to build a CPU-GPU than watch Intel's CPU-only engineers try to force a CPU to act like a GPU :p
 

DrMrLordX

Lifer
Apr 27, 2000
23,115
13,217
136
Originally posted by: taltamir

patents are NOT copy protection... if nvidia engineers it from the ground up it STILL infringes on the patent... take for example the following patent:
"a device for making of sandwiches by means of placing the ingredients on two opposing platters which are then combined by the machine"
This is a real patent that McDonald's got for a burger-making machine. It does not matter if someone designs a machine from the ground up that looks nothing LIKE the McDonald's machine; it is still infringing on their PATENT.

That's not entirely correct. In the end it's up to a bunch of lawyers and a judge to decide, but you can reverse-engineer someone else's product if there's sufficient differentiation between the reverse-engineered design and the patented design (I forget exactly what standard is used, but I've heard the number 10% thrown around). Exactly what constitutes a 10% differentiation between two designs intended to accomplish the same goal, however, is not always clear.

Long story short, if you patent something, a larger competitor could throw a team of engineers and lawyers at you to end-run the patent if you won't license or sell your patent at a price agreeable to said larger competitor. Intel neatly avoids this problem by having more lawyers and better engineers than just about anyone looking to compete with them.

Originally posted by: taltamir


hell no, nothing will run on Windows 95, and an 8-bit processor in this day and age is USELESS!

The Chinese are interested in CPU independence, but they are unwilling to ignore international patent treaties for it, yet... so they developed a 32-bit (and now a 64-bit) chip that does NOT use the x86 instruction set and runs a specially compiled version of Linux.
Their latest version will do SOFTWARE emulation of x86 (which they claim is legal and which Intel says it is looking into; if it is truly all software it is legal, but it will have extremely poor performance).

There was another company that tried to make a chip using very long instruction words and software emulation of x86 (which, again, does not infringe on patents), but it was so unimaginably slow that it failed.

Um . . . well, there's no reason why anyone, China or otherwise, needs to look outside the realm of x86 for a public-domain, 32-bit processor. Every patent governing the creation of the mighty 486DX is in the public domain. It is probable that the patents governing Pentiums up to the P54C (the second-gen Pentium, before the Pentium MMX came out) are in the public domain as well.

The Chinese have been copying MIPS designs, but MIPS seems uninterested in attacking them with lawsuits so long as they do not promote the chips as being MIPS-compatible.

There's no reason in the world that China or Nvidia or someone else couldn't create a 486 knock-off. It just wouldn't be able to handle any of Intel's SIMD extensions in hardware, nor would it handle x86-64, EM64T, 3dnow!, and so on in hardware.

Originally posted by: apoppin

Transmeta

it definitely worked .. it had several misfires and bad company decisions
- it was an "almost" .. but its technology is still used

.. but now maybe the GPU can do it much faster
- if so, Intel has much to fear
:clock:

I had forgotten about them until after I posted. There are (or were) numerous odd x86 licenses floating around back in the day, plus Transmeta with their on-chip x86 emulation.
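
(On taltamir's point about software emulation being slow: the killer is the inner loop, since every guest instruction costs a fetch, a decode, and a dispatch on the host. A toy sketch of that loop for a fake two-instruction machine, nothing like a real emulator:)

/* Fetch-decode-dispatch, the part you pay for on every instruction. */
#include <stdint.h>
#include <stdio.h>

enum { OP_HALT = 0, OP_ADD = 1, OP_PRINT = 2 };

static void run(const uint8_t *code, int32_t *regs)
{
    for (size_t pc = 0;;) {
        uint8_t op = code[pc++];                 /* fetch */
        switch (op) {                            /* decode + dispatch */
        case OP_ADD: {                           /* add rA, rB */
            uint8_t a = code[pc++], b = code[pc++];
            regs[a] += regs[b];
            break;
        }
        case OP_PRINT:
            printf("r%d = %d\n", code[pc], regs[code[pc]]);
            pc++;
            break;
        case OP_HALT:
            return;
        }
    }
}

int main(void)
{
    int32_t regs[4] = { 5, 7, 0, 0 };
    const uint8_t prog[] = { OP_ADD, 0, 1, OP_PRINT, 0, OP_HALT };
    run(prog, regs);   /* prints r0 = 12 */
    return 0;
}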

VIA is still capable of producing x86 processors thanks to their buyouts of Cyrix and Centaur, though how far their license goes is anyone's guess.

Originally posted by: SickBeast
Well then...it looks to me like NV will have better graphics performance, but Intel will have better x86 performance. It's interesting. NV is emulating x86, whereas Intel is sort of emulating a GPU.

I look forward to NV releasing something x86-compatible. I wonder if we'll ever see the day where we can just plug a graphics card into a motherboard and boot up - no CPU or memory required! :)

The thing Nvidia has to look at is the raw cost of designing a CPU that will run code written for Intel processors well. There's more involved than just the instruction set. Yes, Nvidia probably could implement the core IA32 instruction set in hardware and use software emulation for MMX through SSE4 (and 3dnow!, whoopee) along with any other wonky extensions that have been tacked on over the years.
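
Just to be concrete about what that kind of emulation means, here's a toy sketch of SSE2's PADDD (four packed 32-bit adds) redone as a scalar loop. A chip without SSE2 hardware would trap the instruction and run something like this instead, which is exactly where the speed penalty comes from (fake types, purely illustrative):

#include <stdint.h>

typedef struct { uint32_t lane[4]; } xmm_t;     /* stand-in 128-bit register */

static xmm_t emulate_paddd(xmm_t a, xmm_t b)
{
    xmm_t r;
    for (int i = 0; i < 4; i++)
        r.lane[i] = a.lane[i] + b.lane[i];      /* wraps mod 2^32, like PADDD */
    return r;
}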

The real issue is actual CPU design, since they can't just rip off a modern Intel design. They could produce their own chip with a sophisticated cache architecture, deep pipelines, multiple execution units, and so on, but how long would that take and how much would it cost them?

If Nvidia could ever figure out speculative threading, though, they could do frightening things. So could AMD for that matter.

Originally posted by: Fox5

SSE2 support is a required part of x86-64, and many apps rely on its presence even on normal x86.

Not surprising, though it still means you could do a lot without SSE2 and x86-64 support in hardware (or with software-on-chip emulation). I know Windows XP can boot on a pre-SSE (let alone SSE2) machine; what you could do after that would be less than impressive, of course. Vista has allegedly booted on a Pentium II (no SSE or SSE2), so it stands to reason that it, and Windows 7, could boot on a processor 100% compatible with the old P54C, or at least the P55C (Pentium MMX).

Originally posted by: Idontcare

With Intel's access to leading-edge process technology seconded only by AMD's, Nvidia stands to be third in this foot-race if they continue to rely on TSMC to produce their chips as we march towards 16nm and beyond.

Yarr, too true. It would take a major breakthrough in CPU design for Nvidia to compete, namely something that would allow them to apply the raw power of their Stream processors to relatively non-parallelized code. If Intel figures that out first, then never mind about that.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: apoppin
AMD fusion looks like it will arrive YEARS after the intel fusion.
How do you know?
:confused:

AMD blindsided Nvidia with the 4850 and 4870 series, and they are keeping Fusion under wraps; who knows whether they've had a breakthrough. I'd much rather bet that ATi's GPU engineers can teach AMD's CPU engineers how to build a CPU-GPU than watch Intel's CPU-only engineers try to force a CPU to act like a GPU :p

Because AMD keeps pushing it back; Intel, on the other hand, has it on their roadmap, now ahead of AMD's, and they are more punctual... sure, they announced it later, but they will end up shipping it first.

Kinda like Hybrid Power (note, not Hybrid SLI)... AMD announced it first, Nvidia announced it second, Nvidia has had it for a while... AMD dropped it completely and never implemented it.

Of course AMD CAN theoretically get there first, but notice I said "LOOKS LIKE".
 

Spicedaddy

Platinum Member
Apr 18, 2002
2,305
77
91
At the heart of this issue is that the CPU has run its course and the soul of the PC is shifting quickly to the GPU. This is clearly an attempt to stifle innovation to protect a decaying CPU business.

LOL, nVidia shareholders should be worried about this guy.


AMD and Intel are going to integrate decent GPUs with their CPUs, so the stand-alone GPU market is only going to get smaller in the future.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Spicedaddy
AMD and Intel are going to integrate decent GPUs with their CPUs, so the stand-alone GPU market is only going to get smaller in the future.

I'm not so sure about that :p

You need to realise that these CPUs based on the "fusion" concept will carry GPUs equivalent in performance to current-day IGPs. I can tell you it will be a very long time before we see something at the performance level of today's high-end GPUs integrated into a CPU.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com

VIA is still capable of producing x86 processors thanks to their buyouts of Cyrix and Centaur, though how far their license goes is anyone's guess.
the original suggestion was that Nvidia buy VIA and sell the rest off to get at the x86 license held by SIS; then they would have Nvidia graphics and SIS CPUs as a division of Nvidia

but then, do they want their own x86, or do they want to develop the GPU further so it is indispensable to every PC?
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: apoppin
Originally posted by: Idontcare
Originally posted by: apoppin
see, Intel wants "intel inside" everything that can use a CPU .. Jensen has a vision of "Nvidia inside" everything that has potential to use a GPU.

With Intel's access to leading-edge process technology seconded only by AMD's, Nvidia stands to be third in this foot-race if they continue to rely on TSMC to produce their chips as we march towards 16nm and beyond.

they do have access to other foundries .. if that is *all* you are concerned about


I have personally worked with *all* the foundries. I use TSMC specifically as my example here because they are the best-case example of the leading edge one can access through the foundries at this time.

All the other foundries lag behind TSMC's process capability and timeline (with good reason: TSMC's revenue dwarfs all the other foundry players; TSMC is to the foundry world what Intel is to the logic MPU world).
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
AMD is divesting themselves of their foundries, which they will open to anyone. They hope to compete with TSMC. The Saudi Arabian government will buy 51% of the stock in this new company while AMD will own 49%; it will bring in 6 billion dollars to help fund foundry upgrades, which will then be available to AMD and any of its competitors who wish to use the foundries (currently AMD's foundries spend some of their time idle, which has to do with mismanagement).

Will it successfully compete with TSMC? Maybe. Will it potentially sour AMD's relationship with TSMC (who makes their video cards)? Likely. Will Nvidia choose to use it over TSMC? Unlikely.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
You speak as though you have a crystal ball :p

Mine says "yes" .. AMD using its newest state-of-the-art foundries to encourage Nvidia to ALSO use them for their cutting-edge HW
- TSMC will get over it and will keep Nvidia's mainstream production [which should increase if they fulfill their vision of "Nvidia Inside"]

 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Crystal balls are too fragile; mine are made of brass. And their predictions are never wrong :)
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: apoppin
well, isn't AMD building *brand new* foundries that they plan to open also to their competitors?
:confused:

You are thinking of fabs, not foundries. AMD is turning over control of their existing fabs to a foundry (TFC, The Foundry Company), and as such TFC will open its doors to new customers at 32nm in 2H 2010. This is in addition to TFC's plans for opening a new fab in NY.

There is a serious challenge here for TFC's business model as their existing knowledge base for processing technology in the FEOL is SOI-based, not bulk.

No doubt TFC will aggressively try to sign customers for their 32nm bulk CMOS process tech, but it's not like they have a recent history of manufacturing on advanced-node bulk Si to give would-be TSMC/UMC/Chartered/IBM customers confidence that going with The Foundry Company is a risk-acceptable maneuver for a fabless company at this point in time.

(and this is the point in time when customers would need to be designing chips for 32nm production if they were going to be)

All I am pointing out in regards to Intel is that the last thing in the world a fabless company wants to do is compete against Intel while relying on TSMC (as the best-case example) for access to process tech. Being able to rely on TFC is going to be a little bit better, but the gap between Intel and TFC is not going to magically change just because AMD sold their fabs to TFC.

This wasn't an issue in the GPU world, where everyone was fabless, so no GPU company had any more of a leg up on a competitor when it came to process tech. Success came down to management timing the adoption of new process tech at the foundries versus design teams making the most of the xtor parameters the foundries delivered.

Intel changes this because they aren't just coming at NV's and AMD's GPUs with their own design; they are coming at them with their leading-edge process tech... that's the angle that can make even a crappy design still win the revenue. Who is going to be producing 16nm chips with $100M EUV tools first? TSMC? TFC? Or Intel?

Not a foot-race I'd want my fabless company to be in, and these guys know it; I'm not saying anything here that they haven't already contemplated eons ago.
 

mrSHEiK124

Lifer
Mar 6, 2004
11,488
2
0
Originally posted by: aigomorla
Originally posted by: taltamir
Crap? They were vastly superior most of the time.

how?

Nvidia was superior to the P35, P45, X38 and X48?
You'd best take that statement back.

No, they weren't vastly superior for their time.

More like problem-prone compared to the other chipsets.

And yes, I've owned the 680, 780, and 790 for a short time. They were crap compared to my DFI LT X38 and X48, and let's not get started on how badly the UD3P kills Nvidia boards.

I can only think of two nVIDIA chipsets that were "vastly superior": nF2 Ultra 400 on Socket A, because VIA's chipset sucked, and nF4 Ultra/SLI on Socket 939, because VIA's chipset sucked. nF3 had crippled HT; nF1 was before my time. Then AMD/ATI woke up and figured out that since they make the damn processors, there's no reason they shouldn't be making the best possible chipset.

nVIDIA's Intel chipsets have sucked balls all along. I really don't want Intel to win the suit, because then the X58 (and its successors) would have no potential competition = prices+++. Remember, if AMD hadn't been around in the 90s, we MIGHT be using 1GHz P2s right now.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Change that from might to would. The only thing keeping Intel innovating is the competition.
 

DrMrLordX

Lifer
Apr 27, 2000
23,115
13,217
136
Originally posted by: apoppin

the original suggestion was that Nvidia buy VIA and sell it off to get at the x86 license held by SIS; then they would have Nvidia Graphics and SIS CPUs as a division of Nvidia

but then do they want their own x86 or do they want to develop the GPU further so it is indispensable to every PC ?

Well, here's the thing: x86, along with its many extensions, is just an instruction set. What hardware you have on the "back end" interpreting the instructions and carrying them out is up to you. Nvidia has to make their hardware work with current and future x86 apps regardless of how they intend to execute those instructions. And, it's going to need to be fast enough to be worth the end-user's while.

Nvidia could sell their own barebones x86 processors and use them just to boot into Windows Vista/7/etc. far enough to let a bunch of CUDA apps take over and pass the majority of computing tasks along to Stream-based GPU cards plugged into the motherboard.

Or, Nvidia could try to pass x86 instructions along to Stream-based GPU cards or GPU/CPU hybrid chips in hopes of running existing apps instead of bullying people into messing with CUDA.

The former I see as being more probable. Nvidia's apparent strategy revolves around the increasing trend towards parallelized code, which is theoretically to their advantage, since their entire business model revolves around execution of highly-parallelized workloads. All they need (or think they need) is an x86 frontend processor to boot a popular OS; if they can get the majority of real apps compiled to take advantage of their existing hardware instead of x86 hardware, they can end-run Intel by selling their own crappy-arsed x86 CPU boards capable of hosting multiple GPU cards chock full of Stream processors.

The problem with such a strategy is that the trend towards workload parallelization may not move as quickly or as thoroughly as Nvidia wants. There's no telling how well the apps we use on an everyday basis could be parallelized. Generally speaking, "single user" apps will use no more than 8 threads on machines one could reasonably expect to encounter in a home/small-office environment, and that's assuming you run into a Skulltrail or Core i7 system.

Realistically speaking, your average PC app is going to spawn two threads in serious need of CPU time; in time that number will go up to 4. Nvidia already sells cards sporting 480 Stream processors. How many threads do you think Nvidia would need an app to spawn before utilizing their hardware would start to make sense? The answer to that question will govern how readily software developers flock to CUDA.
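
To put rough numbers on that mismatch, here is what a hypothetical kernel launch sized for such a card might look like (made-up sizes, purely illustrative):

// A typical desktop app juggles 2-8 busy CPU threads; even a modest
// GPU kernel launch asks for tens of thousands.
__global__ void busywork(float *data, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] = data[i] * 0.5f + 1.0f;
}

void launch(float *d_data)
{
    const int n = 480 * 256;                  // enough work to feed 480 SPs
    const int threadsPerBlock = 256;
    const int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;   // 480
    // 480 blocks x 256 threads = 122,880 threads in flight,
    // versus the 2-4 busy threads your average PC app spawns.
    busywork<<<blocks, threadsPerBlock>>>(d_data, n);
}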

The latter strategy . . . well I think it could work out okay with SIMD instructions but anything else would be crap without some kind of speculative threading.

 

Bman123

Diamond Member
Nov 16, 2008
3,221
1
81
I'm gonna sit back and enjoy the show here. You guys got me lost on a lot of this stuff, since I was never this into everything.
All I can think is that Nvidia cannot afford to battle with Intel. Sure, Intel is trying to get to where they don't need Nvidia, and I think that time is coming soon.