Here it is, right here - NVIDIA's entry into the x86 CPU market

SunnyD

Belgian Waffler
Originally posted by: Idontcare
Originally posted by: SunnyD
The point is that whoever acquires this part of Transmeta gets a license to make x86-compatible CPUs if they so desire.

Is that a "known" or is that a "presumed"?

I have been operating under the assumption that Intel structured all their x86 license agreements to be non-transferable, including through merger or acquisition of the license holder (Transmeta in this case) by another corporation (Nvidia in this case).

But if this assumption is incorrect, and only AMD's x86 license carries such restrictions, then I would dearly love to know it.

Here's an interesting end-around thought... several years ago Transmeta licensed x86-64 (AMD64) from AMD, and AMD64 has full backwards compatibility with x86-32 (IA32).

Transmeta also, at the very least, USED to produce processors that supported x86 (Crusoe & Efficeon). So saying that their license doesn't allow them to make processors that support x86 (regardless of implementation) is somewhat ... misleading.
 

Idontcare

Elite Member
Originally posted by: SunnyD
Here's an interesting end-around thought... several years ago Transmeta licensed x86-64 (AMD64) from AMD, and AMD64 has full backwards compatibility with x86-32 (IA32).

Transmeta also, at the very least, USED to produce processors that supported x86 (Crusoe & Efficeon). So saying that their license doesn't allow them to make processors that support x86 (regardless of implementation) is somewhat ... misleading.

:confused:

You've got me completely baffled as to how that has anything to do with my post.

Did you mean to quote me in your post? Is your post an effort to address the question in my post?
 

n7

Elite Member
Originally posted by: Idontcare

The harbinger of yet another industry titan reaching an apogee in their career trajectory is the foretelling of the demise of anything related to their industry.

All this well-respected individual just did was put a time-stamp on when his career jumped the shark.

Sweeney isn't the first, and won't be the last, to make lousy claims about the imminent demise of some facet of the industry. We push titans and icons to make these kinds of grand, sweeping statements about the changes the future will bring.

Remember the lists that Bill Gates used to publish regarding how he felt the world would be different in 10 yrs? And how they nearly always turned out to be rubbish?

Here's my favorite graph of the value of projections versus reality:

Itanium Sales: Forecasts versus Actual

You don't agree that what Sweeney is saying makes a lot of sense? :confused:

I'll put it this way: How many games use the Unreal engine in some form or another?
And if you know Epic Games, you know how closely they work with Intel.
I'm not saying what he says will come to pass for certain, but if he wants to go that direction, he will when the opportunity arises.

BTW, this is the real article: http://arstechnica.com/article...sweeney-interview.ars/

Xbit just rehashed it.
 

Idontcare

Elite Member
Originally posted by: n7
You don't agree that what Sweeney is saying makes a lot of sense? :confused:

I'm not saying that in the least. Of course what he says makes sense to us and to him, given the known data points, boundary conditions and initial conditions.

He's a rational man using his logic to make a justifiable statement about the perceived outcome of the current trajectory his industry is in.

Were he to make any other statement, he'd be branded a fool, as no other statement would be justifiable unless he presented evidence in his argument to alter the boundary or initial conditions we perceive his industry to be beholden to.

This is why "stating the obvious" tends to be what brainy people do all the time. If they stated something that defied the obvious then they wouldn't be considered brainy.

In a roundabout way he's saying he can't rationalize a way to justify how the future won't come to represent the end result of its current trajectory. Fair enough; I'd be disappointed if he couldn't pass this bare-minimum self-consistency test.

But what I am saying is that this is perhaps the easiest way for us non-industry titans to know what exactly is NOT going to happen. We don't know what the future of graphics holds for us or for the market players, but I've yet to see a single instance where an industry titan says "in 3-5 yrs, this will come to pass" and it in fact does.

So my rule of thumb when it comes to these things is this: if there were 1000 possible outcomes and an industry titan (Moore, Sweeney, Kilby, etc.) says "XYZ is going to happen", then we can safely conclude, first, that XYZ most certainly is NOT going to happen, and second, that the pool of possibilities has just decreased by one, leaving us with 999 possible outcomes (so there was value added by the industry titan after all).

Got any Rambus RDRAM memory in your desktop?

I don't expect anyone to agree with my philosophy here; I'm just sharing it to explain why I agree that Sweeney makes sense, but at the same time I take his logic as all the proof I need to conclude that it is exactly what will not be happening.
 

n7

Elite Member
Fair enough...

I'd say this is a "time will tell" situation, of course ;)

I just don't see rasterization being used forever though... I suspect the gaming graphics industry is going to see some shifts in the next while.

Anyway, this topic was supposed to be about CPUs, not GPUs, so sorry... I'm going off on a tangent a bit.