Nvidia announces x86 chip *edit: not true*

Page 2 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,111
3,635
126
Originally posted by: SlowSpyder
So CPU's are important after all?

only if you're not a pure gamer.

then again if you are, an Xbox or PS3 would be a better investment.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Nemesis 1


Intel has done nothing but keep promises for the last 3 years, period

that does not mean that Larrabeast will not be a flop :p
- it may succeed in the low end .. where intel's IG has been popular but weak

that is what Nvidia apparently believes if you would just LISTEN to the conference call

and how does this Sound to you?

Nvidia Graphics

SiS CPUs; a *division* of Nvidia Graphics



not so good, huh?
:confused:




 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
How much trouble would it cause / is this even possible -

for nVidia to create (or farm out development of) a non x86 OS, heavily multithreaded, that would run on CUDA hardware?

Sorry if this is a stoopid question but I'm just wondering...

EDIT: Just for the record, this would give an end-run around the monopolistic M$ & Intel simultaneously - I love the idea...
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: Denithor
for nVidia to create (or farm out development of) a non x86 OS, heavily multithreaded, that would run on CUDA hardware?

Like solaris but for Cuda?

The OS operating on a multi-threaded CUDA system is not so much the challenge as NV controls the hardware and they could craft their OS to work well with the hardware.

Rather the challenge is getting apps to work well (enough) on CUDA in the OS that NV would roll.

You need compilers that are good and you need a programming model that makes it really easy for would be app developers to invest the time and effort to make the apps.

It's feasible, but its not going to happen overnight.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: apoppin
Originally posted by: Nemesis 1


Intel has done nothing but keep promises for the last 3 years, period

that does not mean that Larrabeast will not be a flop :p
- it may succeed in the low end .. where intel's IG has been popular but weak

that is what Nvidia apparently believes if you would just LISTEN to the conference call

and how does this Sound to you?

Nvidia Graphics

SiS CPUs; a *division* of Nvidia Graphics



not so good, huh?
:confused:

So very true. But what if it's better than we've been told, by a lot? Then what?

 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Nemesis 1
Originally posted by: taltamir
the only one worth responding to is this:
Where's the link to the contract that says if ULi or VIA sell out, they can transfer the x86 license?

They actually can't, which is why nvidia buys them, and then transfers nvidia staff to it while maintaining it as a separate company, making a contract with itself (i.e., VIA under nvidia ownership makes a contract with nvidia) for future co-development.

Ok. NV bought ULi. Didn't that cancel the x86 contract? If not, what's to stop AMD from doing the same with ATI? AMD maintains control, the two companies work together, one on servers, the other on desktops, netbooks, etc.
I will look for the contract.

how is AMD splitting off ATI and giving it an x86 license the same as nvidia buying an x86 license holder, keeping it separate, but setting up collaboration between the two?

Sure, AMD could have ATI personnel develop the CPU, as long as it is AMD who holds the rights and AMD that sells it... but that would not be an ATI CPU...
assuming there is no secrecy clause in the x86 contract with intel that prohibits such actions, that is...

There is also the issue of who owns whom here: nvidia is discussed as potentially owning SiS; ATI can't own AMD, because AMD owns ATI...

I mean, intel is suing nvidia on the claim that there is a clause prohibiting nvidia from stating that they have a license to make an i7-compatible mobo, and that based on this the contract should be nulled (and nvidia barred from selling any intel-compatible boards at all)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Nemesis 1
Originally posted by: apoppin
Originally posted by: Nemesis 1


Intel has done nothing but keep promises for the last 3 years, period

that does not mean that Larrabeast will not be a flop :p
- it may succeed in the low end .. where intel's IG has been popular but weak

that is what Nvidia apparently believes if you would just LISTEN to the conference call

and how does this Sound to you?

Nvidia Graphics

SiS CPUs; a *division* of Nvidia Graphics



not so good, huh?
:confused:

So very true. But what if it's better than we've been told, by a lot? Then what?

you have to remember that Nvidia and AMD graphics are not sleeping. They are going to keep raising the bar higher and higher for graphics performance, and intel has ONE SHOT to get Larrabeast "right" on the benchmarks

if it is "low end", Nvidia "wins"

period

if it is midrange .. intel needs more cores .. and then it gets expensive for them

NOW let's SAY it is *great* .. Larrabeast STILL has to drag all of x86 overhead AND perform flawlessly

i am betting against intel and for Nvidia here

You need compilers that are good and you need a programming model that makes it really easy for would be app developers to invest the time and effort to make the apps.

It's feasible, but its not going to happen overnight.

Unless i am mistaken, that is ALSO what intel needs for Larrabeast :p
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: apoppin
You need compilers that are good and you need a programming model that makes it really easy for would be app developers to invest the time and effort to make the apps.

It's feasible, but its not going to happen overnight.

Unless i am mistaken, that is ALSO what intel needs for Larrabeast :p

Yep. And they haven't proven themselves overly capable in this regard.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Well, I do believe that in the very near future, we're all going to find out just how good Boris really is.

I am of the mind that, if time permitted it, he would have been a legend. Some say he already is.

You're about to find out if Intel's compilers are all that. Because when you see Larrabee working with Nehalem, the CPU will no longer be considered unusable in a video system. It will shock most. But the real shocker comes with Sandy and AVX.
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,675
146
106
www.neftastic.com
Originally posted by: Idontcare
Originally posted by: apoppin
You need compilers that are good and you need a programming model that makes it really easy for would be app developers to invest the time and effort to make the apps.

It's feasible, but its not going to happen overnight.

Unless i am mistaken, that is ALSO what intel needs for Larrabeast :p

Yep. And they haven't proven themselves overly capable in this regard.

Intel's x86 compilers are actually far better than Microsoft's compilers in nearly all regards. Nvidia has some experience with compilers as well... see CUDA and their HLSL compilers - some of the best in the graphics business.
 

chronodekar

Senior member
Nov 2, 2008
721
1
0
When nVidia says that they are considering to be in the processor field in 2 to 3 years, can it be they are waiting for the x86 license to be nullified?

The x86 was introduced sometime around the late 1980's. The usual period for a copy-right is 15 years, correct? So, after that time period, legally anyone should be allowed to copy x86?

I am aware that more than 15 years has passed, but intel STILL controls the x86 license. Sooo... what am I missing here? (or am I mixing things up?)


And 2-3years sounds like standard development time to me. So, I'd say that nVidia is going to develop their own x86 variant (ignoring the licensing issue) rather than buy anyone.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
I don't think it was up to AMD to decide whether a spun-off ATi would be allowed to build x86 chips. Isn't that Intel's decision?

I think AMD might get veto power in that as well. They're more or less co-partners in x86, though Intel is the dominant partner. (and remember, x86-64 is AMD)

Didn't nvidia buy someone with an x86 license, or transmeta, a while back?

If not, they can just wait until AMD finishes its death spiral and pick up the designs, then depend on the EU to want an Intel competitor around. Even the old single core athlon 64 is a better design than what VIA is offering, and likely AMD's designs will remain good enough for the low-end market for years.

How much trouble would it cause / is this even possible -

for nVidia to create (or farm out development of) a non x86 OS, heavily multithreaded, that would run on CUDA hardware?

Sorry if this is a stoopid question but I'm just wondering...

EDIT: Just for the record, this would give an end-run around the monopolistic M$ & Intel simultaneously - I love the idea...

They could build a version of Linux to do it. Though CUDA already runs in Linux, it just needs software support.
If it had a justifiable advantage, Microsoft may even be forced to include something similar in Windows.
But I don't know if nvidia would want to make an enemy of Microsoft.

The x86 was introduced sometime around the late 1980's. The usual period for a copy-right is 15 years, correct? So, after that time period, legally anyone should be allowed to copy x86?

I am aware that more than 15 years has passed, but intel STILL controls the x86 license. Sooo... what am I missing here? (or am I mixing things up?)

The problem isn't so much x86, it's x86 + extensions. 32 bit may not be open yet. 64 bit definitely isn't. And MMX, SSE, SSE2, and SSE3 support are basically required these days to have full app compatibility.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: SunnyD
Originally posted by: Idontcare
Originally posted by: apoppin
You need compilers that are good and you need a programming model that makes it really easy for would be app developers to invest the time and effort to make the apps.

It's feasible, but its not going to happen overnight.

Unless i am mistaken, that is ALSO what intel needs for Larrabeast :p

Yep. And they haven't proven themselves overly capable in this regard.

Intel's x86 compilers are actually far better than Microsoft's compilers in nearly all regards. Nvidia has some experience with compilers as well... see CUDA and their HLSL compilers - some of the best in the graphics business.

it also isn't just about "how good", is it ?

there is the issue of industry-wide acceptance - especially in PC gaming - that will ultimately make or break them rather quickly, as i understand it
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: chronodekar
When nVidia says that they are considering to be in the processor field in 2 to 3 years, can it be they are waiting for the x86 license to be nullified?

The x86 was introduced sometime around the late 1980's. The usual period for a copy-right is 15 years, correct? So, after that time period, legally anyone should be allowed to copy x86?

I am aware that more than 15 years has passed, but intel STILL controls the x86 license. Sooo... what am I missing here? (or am I mixing things up?)


And 2-3years sounds like standard development time to me. So, I'd say that nVidia is going to develop their own x86 variant (ignoring the licensing issue) rather than buy anyone.

I believe there is some truth to what you're saying. That's the probable reason Intel is going to a new arch with Sandy. Now, as I understand it, Sandy is an x86-capable chip, but Intel is in fact reinventing itself. Sandy will be the new Intel AVX, with x86 ported to AVX. That's my understanding. AVX is Intel's new shield.

 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: Nemesis 1
Originally posted by: chronodekar
When nVidia says that they are considering to be in the processor field in 2 to 3 years, can it be they are waiting for the x86 license to be nullified?

The x86 was introduced sometime around the late 1980's. The usual period for a copy-right is 15 years, correct? So, after that time period, legally anyone should be allowed to copy x86?

I am aware that more than 15 years has passed, but intel STILL controls the x86 license. Sooo... what am I missing here? (or am I mixing things up?)


And 2-3years sounds like standard development time to me. So, I'd say that nVidia is going to develop their own x86 variant (ignoring the licensing issue) rather than buy anyone.

I believe there is some truth to what you're saying. That's the probable reason Intel is going to a new arch with Sandy. Now, as I understand it, Sandy is an x86-capable chip, but Intel is in fact reinventing itself. Sandy will be the new Intel AVX, with x86 ported to AVX. That's my understanding. AVX is Intel's new shield.

Nemesis, as you have educated me on in the past, I realize the russians are involved in making/improving the compilers for EPIC (Itanium) and that Itanium currently emulates x86 compatibility via software translation, and future Itaniums like Tukwila will operate on the same QPI/DMI bus as the x86-based platforms...so where do you see Sandy and AVX fitting into this backdrop?
 

zsdersw

Lifer
Oct 29, 2003
10,505
2
0
I highly doubt one has to have a license to use the IP for the 386 chip. IIRC, that's all public domain at this point.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Are you guys forgetting x86-64bit or AMD64?
:confused:

AMD developed this separately from Intel and i believe intel licenses it from AMD, and that license is not exclusive; VIA CPUs are also 64-bit capable. :p

The question is has AMD made AMD64 "freely accessible/open standard" like they do most of their other stuff such as HyperTransport?

If so, as long as Nvidia is prepared to give up native 32-bit compatibility - which they could emulate as they just happened to hold a Transmeta license - they can skip x86 for x86-64-bit

Perhaps there is no requirement at all to make any agreement with intel for anything except bus licenses.


x86 may not be so important in this new world Jensen envisions
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
I think not. AVX is a new ISA.

The ISA is the part of an overall computer architecture related to programming. The set includes the native data types, instructions, registers, addressing modes, memory architecture, interrupt and exception handling, and external I/O. The ISA defines a specification of functions (machine commands) implemented by a particular microprocessor design. Within a family of processors, the ISA is often enhanced over time with new instructions to deliver better performance and expose new features while at the same time maintaining compatibility with existing applications.

- Microarchitecture refers to the actual design, layout, and implementation of an ISA in silicon. It includes the overall block design, cores, execution units and types (such as floating-point, integer, branch prediction, SIMD, etc), pipeline definitions, cache memory design, and peripheral support. Within a family of processors, the microarchitecture is often enhanced over time to deliver improvements in performance, energy efficiency, and capabilities, while maintaining compatibility with the ISA.

Leading the Instruction Set Revolution - A Long History in ISA

Intel uses the ISA to deliver the superior capabilities of its microarchitecture while maintaining the necessary application-level compatibility across processor generations.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: Nemesis 1
Read this, it's of great interest. Understand AVX is NEW. It's exclusive to Intel.

In here you shall see it plainly written that x86 is ported to AVX. I can't do better than this. Draw your own conclusions.

http://softwarecommunity.intel...gy%20Efficiency_WP.pdf

Nemesis that was a delight to read. Thank you very much for taking the time to post the link, I'm sure I could have found it with some judicious google work but I probably would have never taken the time.

I see exactly what you mean now, they specifically discuss porting some 300+ existing instructions into the AVX ISA. That is quite a reworking of the existing ISA.

Is AVX expected to be incorporated into Larrabee at some point?
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Not that I am aware. But my understanding is the compiler is something really special, to work with the Larrabee ISA. They both have 256-bit vector units, but are different.
 

theeedude

Lifer
Feb 5, 2006
35,787
6,197
126
Adding ISA instructions is a one-way street, because your future products have to support them even if you find out few users actually compile to take advantage of them, so you end up dedicating processor area, power, complexity, and engineering design resources to support something that may not be used much at all. And a lot of those vector instructions are already supported by the GPU, so you end up replicating a lot of resources between the CPU and GPU.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: senseamp
Adding ISA instructions is a one-way street, because your future products have to support them even if you find out few users actually compile to take advantage of them, so you end up dedicating processor area, power, complexity, and engineering design resources to support something that may not be used much at all. And a lot of those vector instructions are already supported by the GPU, so you end up replicating a lot of resources between the CPU and GPU.

Isn't that only a requirement in the decode stage? An unused instruction could always be hardware-emulated by a combination of much slower instructions.

If Intel, for example, found that basically no applications were coded to use the FCOS80 instruction, they could opt not to implement it in an optimized/speedy fashion in future microarchitectures.

They just need to maintain ISA compatibility, which does not require the microarchitecture to have dedicated hardware capable of executing the instruction.

I'm no expert though, this is just my perception of the ISA/microarchitecture backwards compatibility situation.
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,675
146
106
www.neftastic.com
Originally posted by: apoppin
If so, as long as Nvidia is prepared to give up native 32-bit compatibility - which they could emulate as they just happened to hold a Transmeta license - they can skip x86 for x86-64-bit

Perhaps there is no requirement at all to make any agreement with intel for anything except bus licenses.

Nvidia licensed LongRun2 from Transmeta (power management), not core CPU architecture licenses/code morphing.

There's a reason AMD went with the EV7 bus and then designed HyperTransport... they didn't want to pay royalties to Intel for both x86 and GTL+ anymore. The bus license was more lucrative to Intel's royalty margins... and in fact it's the entire reason VIA got into hot water with its Pentium 4 chipsets - they stuck SDRAM on GTL+ when Intel was desperately trying to push RDRAM. If nvidia didn't care about being socket-compatible with anybody (just like AMD decided with the Athlon), the bus license becomes moot. They still have to worry about x86 though (or particularly all the MMX variants).

I also have this distinct feeling that AMD may have made a promise behind closed doors to Intel not to give x86-64 to nvidia, in exchange for (or to provide comfort regarding) "blessing" the foundry spinoff. AMD itself wouldn't want nvidia in the CPU game anyway, for multiple reasons. Besides, x86-64 doesn't fit in very well with the MID/SOC plans nvidia has anyway.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Nvidia licensed LongRun2 from Transmeta (power management), not core CPU architecture licenses/code morphing.

Not so, SunnyD

http://news.cnet.com/nanotech/?keyword=Transmeta
http://www.deviceguru.com/transmeta-seeks-buyer/

In August of this year, video-chip powerhouse Nvidia paid Transmeta $25 million for a "non-exclusive license to Transmeta's LongRun and LongRun2 technologies and other intellectual property" for incorporation into future Nvidia chips.

and worth reading:
http://www.computerworlduk.com...ndex.cfm?articleid=225
According to the complaint, Intel and Transmeta were working together until a dustup over the value of Transmeta's IP (intellectual property). Transmeta says that Intel folded up its chequebook but kept using Transmeta IP in Intel designs. Having just declared itself the new papa of the green x86, it looks bad that Intel might have let a little of Transmeta's 1-watt 32-bit x86 leak into Intel's performance-per-watt chips.
.... In 1995, Transmeta set out to create a metaprocessor, a CPU that could assume the personality of another. Transmeta first created Crusoe, a uniquely flexible CPU with a native VLIW (very long instruction word) architecture. Itanium is another VLIW design, but as opposed to Intel, Transmeta never required developers to code to its CPUs' native architectures. Instead, Transmeta wrote Code Morphing software to translate x86 instructions into native VLIW operations on the fly. Any 32-bit x86 software you choose runs, unmodified, on a Transmeta CPU. Translated code is cached, so Transmeta processors -- the current being Efficeon -- speed up as they learn the instruction mix of your applications.

The Code Morphing software not only translates x86 code to VLIW in real time; it analyses the code it's translating and makes fine-grained adjustments to CPU voltage and clock frequency based on performance demands and thermal conditions. It's key that Efficeon doesn't rely on the OS to measure load and change speed and voltage. Efficeon and Code Morphing measure and adjust to demand by themselves. Transmeta calls this LongRun, and LongRun2 pushes power-saving technology further by greatly reducing the amount of current that transistors leak while they're in the "off" state.
Nvidia has this now

Now do you see why intel is panicking and suing Nvidia .. and Nvidia says, "screw you, we can do better than you can with our own CPU" - in a 1-3 year time frame :p
- it is a huge slap in intel's face and designed to throw them completely off balance
- imo .. of course, it is what i would do if i was gambling
