nVidia entering the x86 scene?

Cogman

Lifer
Sep 19, 2000
10,286
145
106
http://www.pcauthority.com.au/...-make-an-x86-chip.aspx

I would be surprised if they can get past the thousands of patent suits that will fly their way, but if they pull it off, it might be interesting to have another competitor in the desktop market.

Someone on Slashdot pointed out that at least parts of the x86 instruction set should be coming into the public domain (as patents only last 20 years). So who knows, maybe nVidia will come out with some instruction set or design that Intel and AMD HAVE to have, forcing them into negotiations.

Or maybe nVidia will only be developing small form factor CPUs. Who knows? Either way, this is interesting news indeed.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
I'd love to see more than Intel, AMD, and VIA in the x86 arena. Preferably with the x86 license nearly open.
 

Martimus

Diamond Member
Apr 24, 2007
4,490
157
106
That was probably the dumbest "article" I have ever read. Whoever wrote that seems to have no idea what he/she is talking about.

That being said, it will be interesting to see if this rumor has any truth to it. It would be great to have another player in the mainstream CPU business!
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91

First off, this PC Authority site is just blatantly ripping off content from The Inq. They should not be linked in the slightest. As much as I dislike TheINQ, it really disgusts me when people intentionally pirate web content. At least places like HOCP only clip a portion of the article and then direct the hits to the originator of the content if the reader wants to finish reading the story.

http://www.theinquirer.net/inq...nvidia-trying-x86-chip

Second of all, this is a Charlie Demerjian article/rant. Groo has developed the nastiest of reputations and an utter lack of credibility (i.e. zero journalistic integrity, pure rabid anti-NV bias)...so anytime we see Groo writing another NV-related article we really should assume that it is just Charlie trying to piss on Nvidia in a new way while collecting a paycheck.

Basically the last time Charlie was right about anything, seriously think about it, was, oh shoot, I can't think of the last time he actually added any value to the web, TheINQ, etc.

If FUD had an article on it I would consider it plausible. If TheINQ had an article on it which was not authored by Charlie, I'd think it possible but probably not plausible. If Charlie writes about it and it relates to Intel, NV, AMD, or ATi, then it is guaranteed to be a complete work of fiction. (Caveat: if Charlie writes about it and Sony denies it, then I consider it 100% confirmed.) Charlie's record has created this correlation; he's only got himself to thank for it.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Martimus
That was probably the dumbest "article" I have ever read. Whoever wrote that seems to have no idea what he/she is talking about.

That being said, it will be interesting to see if this rumor has any truth to it. It would be great to have another player in the mainstream CPU business!

in Jensen's (Jen-Hsun Huang, CEO Nvidia) latest 38-minute interview with Charlie Rose this last Friday night on PBS, he said NO

Regarding Intel, he said, "there is no need to reinvent the CPU", and Jensen actually said the core of the "tension" between Nvidia and Intel "is the battle for the soul of the PC".
:Q

Nvidia wants the GPU to become *at least as important* as the CPU as the world moves to cloud computing to complement client computing
[or supplant it, more likely]

it is an awesome interview about Nvidia's "intuition" for the future of expanding computing; one of Rose's best tech interviews

 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: apoppin
Originally posted by: Martimus
That was probably the dumbest "article" I have ever read. Whoever wrote that seems to have no idea what he/she is talking about.

That being said, it will be interesting to see if this rumor has any truth to it. It would be great to have another player in the mainstream CPU business!

in Jensen's (Jen-Hsun Huang, CEO Nvidia) latest 38-minute interview with Charlie Rose this last Friday night on PBS, he said NO

Regarding Intel, he said, "there is no need to reinvent the CPU", and Jensen actually said the core of the "tension" between Nvidia and Intel "is the battle for the soul of the PC".
:Q

Nvidia wants the GPU to become *at least as important* as the CPU as the world moves to cloud computing to complement client computing
[or supplant it, more likely]

it is an awesome interview about Nvidia's "intuition" for the future of expanding computing; one of Rose's best tech interviews

While it is obvious why Jensen would say this, it is also obvious why they can't really be believed when they say it.

There once was a world where CPU makers co-existed with chipset makers (ALi, NV, VIA, ATi, etc.)...Jensen watched that world collapse as the CPU makers wanted larger and larger pieces of the chipset pie.

In the world where CPU makers co-existed with discrete GPU makers, it seemed like there were enough sales dollars to go around. But once AMD decided to incorporate ATi into their business model, and Intel elected to go after the discrete GPU with their ground-up Larrabee business model, Jensen and Nvidia would be the fools in this scenario not to see where the industry is headed.

They will become the next ALi of the GPU world if they do not get ahead of the coming transition, as AMD did. But of course they'd be fools to tip their hand as well, so it is completely understandable why Jensen says what he says. But we'd be the fools if we believed what he said.

Nvidia simply has no choice. If they want to survive they must diversify their business model at least as quickly as Intel and AMD, and given that AMD and Intel are much larger than Nvidia, I would argue that it is imperative for Nvidia to diversify at a rate much faster than Intel's and AMD's.

Oh, they could scale down in size and scope and not die entirely. VIA still exists, but I bet they'd do a few things differently if they could jump into a time machine and head back to 1999.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Idontcare
Originally posted by: apoppin
Originally posted by: Martimus
That was probably the dumbest "article" I have ever read. Whoever wrote that seems to have no idea what he/she is talking about.

That being said, it will be interesting to see if this rumor has any truth to it. It would be great to have another player in the mainstream CPU business!

in Jensen's (Jen-Hsun Huang, CEO Nvidia) latest 38-minute interview with Charlie Rose this last Friday night on PBS, he said NO

Regarding Intel, he said, "there is no need to reinvent the CPU", and Jensen actually said the core of the "tension" between Nvidia and Intel "is the battle for the soul of the PC".
:Q

Nvidia wants the GPU to become *at least as important* as the CPU as the world moves to cloud computing to complement client computing
[or supplant it, more likely]

it is an awesome interview about Nvidia's "intuition" for the future of expanding computing; one of Rose's best tech interviews

While it is obvious why Jensen would say this, it is also obvious why they can't really be believed when they say it.

There once was a world where CPU makers co-existed with chipset makers (ALi, NV, VIA, ATi, etc.)...Jensen watched that world collapse as the CPU makers wanted larger and larger pieces of the chipset pie.

In the world where CPU makers co-existed with discrete GPU makers, it seemed like there were enough sales dollars to go around. But once AMD decided to incorporate ATi into their business model, and Intel elected to go after the discrete GPU with their ground-up Larrabee business model, Jensen and Nvidia would be the fools in this scenario not to see where the industry is headed.

They will become the next ALi of the GPU world if they do not get ahead of the coming transition, as AMD did. But of course they'd be fools to tip their hand as well, so it is completely understandable why Jensen says what he says. But we'd be the fools if we believed what he said.

Nvidia simply has no choice. If they want to survive they must diversify their business model at least as quickly as Intel and AMD, and given that AMD and Intel are much larger than Nvidia, I would argue that it is imperative for Nvidia to diversify at a rate much faster than Intel's and AMD's.

Oh, they could scale down in size and scope and not die entirely. VIA still exists, but I bet they'd do a few things differently if they could jump into a time machine and head back to 1999.

if you actually listened to his interview, you would realize that the GPU has just started to revolutionize the way we do computing

their hardcore gaming business is less than 30% of their total

they have got a vision to put a "GPU inside" .. well, everywhere intel has a "cpu inside"

they are putting a GPU into Audi this year .. it is the beginning of Nvidia's diversification

their CUDA is taught in 50 universities and they will lead in real-time imaging .. they already work with every maker of CAT scan equipment

Jensen's vision is just getting started

i am certain they keep their options open - he said they are going to invest MORE this year into R&D than last - but right now X86 looks like a distraction to them
- otoh intel is entering Nvidia's "territory" with Larrabee - because nvidia is right!

 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: apoppin

if you actually listened to his interview, you would realize that the GPU has just started to revolutionize the way we do computing

1. why are you implying I did not listen to his interview?

2. why are you implying I don't realize what the GPU has started?
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Really nothing new here.
This rumor is so old. The only way they would get rights to use x86 is if Intel stops making money off it and sells it to them.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: Cogman

Someone on Slashdot pointed out that at least parts of the x86 instruction set should be coming into the public domain (as patents only last 20 years).

What is available, and what will soon be free, Nvidia and the rest of the EE world already know.
The next patents to be freed up cover the instructions of the 486. Extremely old tech.
The Pentium instruction set won't be free until 2013.

I would love to see more competition, but for now trying to develop another x86 chip without Intel's patents is nearly impossible.


Remember, it's not the layout or the circuitry that is the patent problem. It's the instruction set used by the programs that run on it. Nvidia could make a CPU that only ran the 486 instruction set, but I wouldn't want to be the one convincing programmers to only use those instructions.
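
To make that concrete, here's a rough sketch (my own illustration using gcc's standard target flags, nothing from the article): plain C built for a 486-only target has to give up every instruction added after the 486 era, so the compiler loses CMOV (Pentium Pro), MMX (Pentium MMX), and SSE (Pentium III) even for trivial code.

/* isa486.c - a minimal sketch of what "486-only" code generation means.
 *
 * Build restricted to the i486 ISA:
 *   gcc -m32 -march=i486 -O2 isa486.c -o isa486
 *
 * The same source built with -march=pentium3 or later is free to use
 * CMOV and, at higher optimization levels, SSE vectorization.
 */
#include <stdio.h>

/* A ternary like this can compile to a single CMOV on P6-class targets;
 * with -march=i486 the compiler must emit compare-and-branch instead. */
static int imin(int a, int b)
{
    return (a < b) ? a : b;
}

int main(void)
{
    int i, sum = 0;

    /* A simple reduction loop like this can be vectorized with SSE on
     * newer targets (e.g. -O3 -march=pentium3); on i486 it stays scalar. */
    for (i = 0; i < 1000; i++)
        sum += imin(i, 500);

    printf("sum = %d\n", sum);
    return 0;
}

The chip would run either build fine; the hard sell is that compilers and shipping binaries moved past the 486 baseline long ago.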
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Can we give the attempts to connect the words "x86" and "nVIDIA" a rest? I think this subject's been beaten to death.

If Intel has x86, nVIDIA has CUDA. 'nuff said.
 

magreen

Golden Member
Dec 27, 2006
1,309
1
81
I foresee a "fusion" between the regular posters in the video card forum and the CPU forum. And it might not be so pretty...

idc, brace yourself.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: magreen
I foresee a "fusion" between the regular posters in the video card forum and the CPU forum. And it might not be so pretty...

idc, brace yourself.

But all that heat shoved under one IHS!? Something's gonna melt, or smoke, or both! :laugh:
 

ajaidevsingh

Senior member
Mar 7, 2008
563
0
0
If they go the hardware emulation route I will surely root for them.. just imagine an Nvidia x86 SoC CPU!!
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Idontcare
Originally posted by: apoppin

if you actually listened to his interview, you would realize that the GPU has just started to revolutionize the way we do computing

1. why are you implying I did not listen to his interview?

2. why are you implying I don't realize what the GPU has started?

By your comments :p

You said they need to diversify; Nvidia is doing it at a faster rate than you apparently think. i saw their preview at Nvision08 of their diversification and i am impressed.
. . . cloud computing and supercomputers ... CUDA .. the GPU as "all purpose" ... 3D for games and TV .. PhysX .. medical imaging .. the auto, entertainment and oil industries .. mobile computing ..
.. how much more diversification does Nvidia need?

i listened to the Rose interview with Jensen, twice .. and took good notes the first time


 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: apoppin
Originally posted by: Idontcare
Originally posted by: apoppin

if you actually listened to his interview, you would realize that the GPU has just started to revolutionize the way we do computing

1. why are you implying I did not listen to his interview?

2. why are you implying I don't realize what the GPU has started?

By your comments :p

You said they need to diversify; Nvidia is doing it at a faster rate than you apparently think. i saw their preview at Nvision08 of their diversification and i am impressed.
. . . cloud computing and supercomputers ... CUDA .. the GPU as "all purpose" ... 3D for games and TV .. PhysX .. medical imaging .. the auto, entertainment and oil industries .. mobile computing ..
.. how much more diversification does Nvidia need?

i listened to the Rose interview with Jensen, twice .. and took good notes the first time

Having hype catchphrases and slideware does not constitute diversification.

It may constitute a plan to eventually become diversified, and having such a plan certainly does not contradict anyone saying they need to become diversified.

I stand by the statements I made in my post and I don't see where you've said anything that actually disagrees with them.
 

mrSHEiK124

Lifer
Mar 6, 2004
11,488
2
0
I just wish they'd find a way to make my GTX260 not so useless when I'm not gaming (which is 99% of the time)
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: mrSHEiK124
I just wish they'd find a way to make my GTX260 not so useless when I'm not gaming (which is 99% of the time)

Ha ha, the same could be said of your CPU :p

Think about what you are saying...apply it to your car when you aren't driving, or your house when you are at work, or the money in your checking account earning 0% APR.

The GPU does what it was designed and sold to do; it is your job to make sure it has something to do for you. If all else fails, check out F@H.
 

magreen

Golden Member
Dec 27, 2006
1,309
1
81
Well, except here the GPU really might be able to lend a hand with other intensive compute tasks -- it's not just a car he isn't driving. I mean, if you've got a blowtorch down in the basement and you're having trouble getting the fire in the fireplace lit with matches... well, you know.
 

Cogman

Lifer
Sep 19, 2000
10,286
145
106
Originally posted by: Idontcare
Originally posted by: mrSHEiK124
I just wish they'd find a way to make my GTX260 not so useless when I'm not gaming (which is 99% of the time)

Ha ha, the same could be said of your CPU :p

Think about what you are saying...apply it to your car when you aren't driving, or your house when you are at work, or the money in your checking account earning 0% APR.

The GPU does what it was designed and sold to do; it is your job to make sure it has something to do for you. If all else fails, check out F@H.

My wish goes along the lines of GPUs becoming more power-efficient and maybe even implementing a power-saving scheme like modern CPUs have (Cool'n'Quiet).
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
They already do (well, high-end cards anyway). Normally low/mid-range cards consume somewhere between 10~30W at idle and 50~80W under load. Current high-end cards such as the GTX series consume ~25W at idle, which is very impressive. These power-saving schemes will get better, and will probably be implemented only for the high end, seeing as the next-gen 40nm GT218 has a TDP rating of 22W!

I'm just hoping that CUDA will eventually become mainstream.