apoppin
Originally posted by: SickBeast
I wonder if we'll ever see the day where we can just plug a graphics card into a motherboard and boot up - no CPU or memory required!
doubtful in a PC
it is a marriage
Originally posted by: SickBeast
Originally posted by: apoppin
doubtful
it is a marriage
Unless they make something like the Fusion.
Originally posted by: DrMrLordX
Originally posted by: SickBeast
That's some pretty great info, thanks!
I always thought that x86 was a basic instruction set. Shouldn't that have been created way more than 17 years ago? If you're right about this, it should give NV the right to create an x86 processor, provided they engineer it from the ground up and don't infringe on any of the more modern patents.
x86 is much older than 17 years, yes. Extensions to the instruction set were made with nearly every upgrade of processor generations throughout the years, including some baby-step evolutionary upgrades between generations. And glad to be of assistance.
Originally posted by: aka1nas
Yeah, but what do you actually get out of that? A basic 8-bit (or even 16-bit) x86 processor can't run modern software. They would really need an x86-64 license (among others) to make something usable.
It gets you a 32-bit processor (386s, 486s, and Pentiums were all 32-bit; the 286 was still 16-bit) that could boot Windows 7, at least in theory, if not in practice. It might be slow as balls, but you get the idea. If the patents behind the P6 core are now in the public domain, it might enable Nvidia to reverse-engineer a Klamath-core P2 or something (or at least a Pentium Pro). I doubt they would have any legal answer to x86-64 or EM64T, but it sure would be interesting to see whether AMD would license those extensions to Nvidia. My guess would be no, but you never know.
Anyway, they might be able to do something with that, but it would take a lot of resources they don't necessarily have, and whatever they created would probably wind up looking a lot like a Cell processor: they'd have to use multiple vector processors to make up for the lack of native SSE/SSE2/SSE3/SSE4 support, and they'd have to do instruction translation in hardware to feed SIMD instructions to the vector processors (and I don't even know if they could do that legally).
Originally posted by: apoppin
see, Intel wants "intel inside" everything that can use a CPU .. Jensen has a vision of "Nvidia inside" everything that has potential to use a GPU.
Originally posted by: alkalinetaupehat
Ya know, hot wings are so much better than popcorn for stuff like this. You can throw the bones at the loser!
Originally posted by: taltamir
AMD Fusion looks like it will arrive YEARS after the Intel fusion. AMD's Fusion looks to be a step over IG.
Originally posted by: Idontcare
Originally posted by: apoppin
see, Intel wants "intel inside" everything that can use a CPU .. Jensen has a vision of "Nvidia inside" everything that has potential to use a GPU.
With Intel's access to leading-edge process technology second only to AMD's, Nvidia stands to be third in this foot-race if they continue to rely on TSMC to produce their chips as we march towards 16nm and beyond.
Originally posted by: taltamir
AMD fusion looks like it will arrive YEARS after the intel fusion.
How do you know?
Originally posted by: taltamir
patents are NOT copy protection... if nvidia engineers it from the ground up it STILL infringes on the patent... take for example the following patent:
"a device for making of sandwiches by means of placing the ingredients on two opposing platters which are then combined by the machine"
This is a real patent that McDonald's got for a burger-making machine. It does not matter if someone designs a machine from the ground up that looks nothing LIKE the McDonald's machine; it is still infringing on their PATENT.
Originally posted by: taltamir
hell no, nothing will run on Windows 95, and an 8-bit processor in this day and age is USELESS!
The Chinese are interested in CPU independence, but they are unwilling to ignore international patent treaties for it, yet... so they developed a 32-bit (and now a 64-bit) chip that does NOT use the x86 instruction set and runs a specially compiled version of Linux.
Their latest version will do SOFTWARE emulation of x86 (which they claim is legal, and which Intel says it is looking into; if it's truly software-only it is legal, but performance will be extremely poor).
There was some other company that tried to make a chip using very long instruction words and software emulation of x86 (which, again, does not infringe on the patents), but it was so unimaginably slow that it failed.
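At its core, the software x86 emulation discussed above is just a fetch-decode-execute loop run in ordinary code. A minimal sketch in Python, using a made-up two-opcode subset (not real x86 encodings, and none of the dynamic-translation tricks a real emulator would need to be fast):

```python
# Minimal sketch of software CPU emulation: a fetch-decode-execute loop
# over a toy two-opcode "instruction set". Real x86 emulators decode
# binary opcodes and usually translate hot code to native instructions.

def emulate(code):
    regs = {"eax": 0, "ebx": 0}
    pc = 0                                  # "program counter" into the list
    while pc < len(code):
        op = code[pc]
        if op == "mov":                     # mov reg, imm
            reg, imm = code[pc + 1], code[pc + 2]
            regs[reg] = imm & 0xFFFFFFFF    # keep 32-bit semantics
            pc += 3
        elif op == "add":                   # add dst, src
            dst, src = code[pc + 1], code[pc + 2]
            regs[dst] = (regs[dst] + regs[src]) & 0xFFFFFFFF
            pc += 3
        else:
            raise ValueError(f"unknown opcode {op!r}")
    return regs

# mov eax, 2 ; mov ebx, 40 ; add eax, ebx
print(emulate(["mov", "eax", 2, "mov", "ebx", 40, "add", "eax", "ebx"]))
# → {'eax': 42, 'ebx': 40}
```

The interpretive overhead (several host operations per guest instruction) is why pure software emulation is legal but slow, and why Transmeta-style designs cached translated code instead of re-interpreting it.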
Originally posted by: apoppin
Transmeta
it definitely worked .. it had several misfires and bad company decisions
- it was an "almost" .. but its technology is still used
.. but now maybe the GPU can do it much faster
- if so, Intel has much to fear
Originally posted by: SickBeast
Well then...it looks to me like NV will have better graphics performance, but intel will have better x86 performance. It's interesting. NV is emulating x86, whereas intel is sort of emulating a GPU.
I look forward to NV releasing something x86-compatible. I wonder if we'll ever see the day where we can just plug a graphics card into a motherboard and boot up - no CPU or memory required!
Originally posted by: Fox5
SSE2 support is a required part of x86-64, and many apps rely on its presence even on normal x86.
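Because SSE2 is part of the x86-64 architectural baseline, software can assume it on any 64-bit x86 machine, while on 32-bit x86 it must be probed at runtime (e.g. via CPUID). A tiny illustrative helper, with hypothetical naming, assuming machine strings like those returned by Python's `platform.machine()`:

```python
# Illustrative helper: the x86-64 spec makes SSE2 mandatory, so on a
# 64-bit x86 machine no feature probe is needed. On 32-bit x86, SSE2
# is an optional extension (Pentium 4 and later) and must be detected.

def sse2_guaranteed_by_isa(machine: str) -> bool:
    """True if the ISA baseline itself guarantees SSE2 support."""
    return machine.lower() in {"x86_64", "amd64"}

print(sse2_guaranteed_by_isa("x86_64"))  # → True  (64-bit: SSE2 is baseline)
print(sse2_guaranteed_by_isa("i686"))    # → False (32-bit: must probe CPUID)
```

This baseline guarantee is why x86-64 compilers emit SSE2 scalar floating-point code by default, and why any clean-room x86-64 part would have to implement SSE2 too.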
Originally posted by: Idontcare
With Intel's access to leading-edge process technology second only to AMD's, Nvidia stands to be third in this foot-race if they continue to rely on TSMC to produce their chips as we march towards 16nm and beyond.
Originally posted by: apoppin
Originally posted by: taltamir
AMD fusion looks like it will arrive YEARS after the intel fusion.
How do you know?
AMD blindsided Nvidia with the 4850 and 4870 series. They are keeping Fusion under wraps; who knows if they've had any breakthrough. I'd much rather think ATi's GPU engineers can teach AMD's CPU engineers how to build a CPU-GPU easier than watch Intel's "CPU only" engineers try to force a CPU to act like a GPU.
At the heart of this issue is that the CPU has run its course and the soul of the PC is shifting quickly to the GPU. This is clearly an attempt to stifle innovation to protect a decaying CPU business.
Originally posted by: Spicedaddy
AMD and Intel are going to integrate decent GPUs with their CPUs, so the stand-alone GPU market is only going to get smaller in the future.
Originally posted by: apoppin
the original suggestion was that Nvidia buy VIA and sell it off to get at the x86 license held by SIS; then they would have Nvidia Graphics and SIS CPUs as a division of Nvidia
VIA is still capable of producing x86 processors thanks to their buyouts of Cyrix and Centaur, though how far their license goes is anyone's guess.
Originally posted by: apoppin
Originally posted by: Idontcare
Originally posted by: apoppin
see, Intel wants "intel inside" everything that can use a CPU .. Jensen has a vision of "Nvidia inside" everything that has potential to use a GPU.
With Intel's access to leading-edge process technology second only to AMD's, Nvidia stands to be third in this foot-race if they continue to rely on TSMC to produce their chips as we march towards 16nm and beyond.
they do have access to other foundries .. if that is *all* you are concerned about
Originally posted by: apoppin
well, isn't AMD building *brand new* foundries that they plan to open to their competitors as well?
Originally posted by: aigomorla
Originally posted by: taltamir
Crap? They were vastly superior most of the time.
how?
Nvidia was superior to the P35, P45, X38 and X48?
Please u best take that statement back.
No, they weren't vastly superior for their time.
More like problem-prone compared to the other chipsets.
And yes, I've owned the 680, 780 and 790 for a short time. They were crap compared to my DFI LT X38 and X48, and let's not get started on how badly the UD3P will kill Nvidia boards.
Originally posted by: apoppin
the original suggestion was that Nvidia buy VIA and sell it off to get at the x86 license held by SIS; then they would have Nvidia Graphics and SIS CPUs as a division of Nvidia
but then do they want their own x86, or do they want to develop the GPU further so it is indispensable to every PC?
