
"Does nVidia have a future?"

In all seriousness, the thing that has me curious about Tegra more than anything is whether it is SLI-capable. It may sound foolish, but it seems to me that, say, a four-way Tegra would make for an incredible handheld gaming device. Four watts of power while running Quake at ~150 FPS would be extremely impressive.
 
As things stand at the moment, there's no question that Nvidia can build a better integrated GPU than Intel - it has done so for years. The problem is that integrated graphics aren't really powerful enough for gaming, and if both Intel and AMD can deliver an integrated CPU/GPU that can easily handle Blu-ray and HD YouTube video playback, there's no need for an integrated graphics chipset from Nvidia. We've constantly quizzed Nvidia on this point, but it has yet to come back to us with a convincing answer.

I think this quote is extremely important. The problem for nVidia is that their entire motherboard and integrated GPU chipset business is likely to die in the future. Both AMD and Intel will be coming out with on-die GPUs, and it is not hard (especially with process shrinks) to imagine a multi-core CPU with a GPU core on the same die. While AMD's Radeon series will suffer as well, at least AMD will (hopefully) get something back in the form of increased dollars for an integrated platform.

I can easily see a dual-core CPU with an on-die GPU that can handle HD video playback being all that a general consumer would need. Just look at Intel's upcoming Pine Trail platform. AMD has Fusion in the works (which they've talked about forever); it'll take a while to arrive due to AMD's lack of funds, but it will get here eventually, and a similar platform will come from Intel. This will eventually evolve into a system-on-chip (SoC) with almost everything on one die.

A complete SoC is highly desirable in the mobile device and compact desktop markets because it should lead to simpler motherboard designs, and it makes a notebook easier to design around, since there are fewer areas that need a cooling system when most of the heat-generating parts are concentrated on the SoC. Designed correctly, it should also lead to lower power consumption and longer-lasting devices.

GPU is nVidia's forte, but if the integrated GPU is good enough for the 90% of consumers who only use it for things like video decoding and light gaming, nVidia is going to be relegated to a smaller and smaller market as integrated GPUs get better and better. While GPGPU and high-end discrete GPUs will still be a profitable market, the low-end and probably mid-range GPUs of today may disappear as integrated GPUs become "good enough" for the general consumer. That's gonna be a huge blow to nVidia if they don't find other markets.

I think nVidia recognizes this as well, with their GPGPU initiative and their push into other markets as evidenced by the Tegra platform. I think Tegra might be more important than GPGPU for nVidia's long-term survival. I don't think it can be overstated how important mobile devices will be, and I'm lumping smartphones (which will largely replace any remaining PDAs), media players, netbooks, and tablet PCs into the mobile devices category.

With the advent of the iTunes App Store and similar competing app stores like the Zune store, I think Microsoft's stranglehold on the OS business isn't going to help it in the mobile market. When it comes to PCs, Linux has failed because, even though a lot of software is free, most consumers expect a PC to work with the Windows-based software they are used to. I think app stores will replace the "Why doesn't XYZ Windows software work with my PC?" mentality with one of "Does the XYZ app store have a program to do this?"

Mobile devices as a market are only going to get bigger and bigger, IMHO, while the PC market has largely stabilized: most users buy new PCs to replace their existing ones, and mobile devices will be complements to the PC. I can easily see families with one or two PCs but upwards of five mobile devices in a household. I think nVidia recognizes this as well, since they've spent upwards of 500 million dollars developing Tegra.

The above is just my thoughts running rampant. If anything comes off as incoherent, please excuse it. 🙂
 
It also relies heavily on the ARM CPU architecture becoming more relevant outside of the mobile phone market because, currently, most software is developed for x86 processors. ARM is confident, though; it recently said that it believes that, by 2013, Microsoft will have no choice but to support its CPUs in its fully fledged Windows OSes.

The Radeon HD 2900 XT... the card that caused Nvidia to take its foot off the gas

So this was ATI's strategy after all! Now the 2900 XT makes sense!

Its partnership with ARM, in particular, is an interesting one because ARM is also looking to broaden its horizons. It's a massive player in the mobile phone market and the company is a sleeping giant. As mobile phones and personal computers converge, there's an interesting battle brewing between Intel and ARM - and I think there's more to it than meets the eye.
 
We need a complete overhaul in Intellectual Property laws. Especially tech-related ones. I think most will agree that the time goes much faster in technology than anywhere else in our life.
 
- NV should try everything possible to get inside the Wii 2.
- Since Intel is doing just fine selling its slow integrated 4500HD in laptops, NV should create a special team that designs a mobile GPU from scratch (just as the Pentium M was designed to excel as a mobile CPU first). Focus on power consumption first and performance second, and you have the Apple laptop and Asus netbook markets.
- Somehow entice a few gaming companies to create engines more demanding than Crytek's, bringing the next 2-3 generations of graphics cards to their knees and forcing us to upgrade :evil:
- Get CUDA into the professional space (i.e., complex stock option models for banks, CUDA for science apps, etc.)
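To make the "complex stock option models" bullet concrete, here is a minimal CPU-side sketch (Python with NumPy; the function names and parameter values are my own, not any Nvidia API) of Monte Carlo pricing of a European call option under Black-Scholes assumptions. Every simulated path is independent, which is exactly the embarrassingly parallel shape of workload that maps naturally onto CUDA.

```python
# Sketch only: Monte Carlo option pricing, the kind of bank workload a GPU accelerates.
import numpy as np
from math import exp, log, sqrt, erf

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(s, k, t, r, sigma):
    """Closed-form Black-Scholes price, used here as a sanity check."""
    d1 = (log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return s * norm_cdf(d1) - k * exp(-r * t) * norm_cdf(d2)

def monte_carlo_call(s, k, t, r, sigma, n_paths=200_000, seed=0):
    """Price the same option by simulating independent terminal prices.

    Each path is independent of every other path, so on a GPU each
    thread could simulate its own path with no synchronization.
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    terminal = s * np.exp((r - 0.5 * sigma ** 2) * t + sigma * sqrt(t) * z)
    payoff = np.maximum(terminal - k, 0.0)   # call payoff: max(S_T - K, 0)
    return exp(-r * t) * payoff.mean()       # discounted average payoff

if __name__ == "__main__":
    print(black_scholes_call(100, 100, 1.0, 0.05, 0.2))  # analytic, ~10.45
    print(monte_carlo_call(100, 100, 1.0, 0.05, 0.2))    # simulated estimate
```

With 200,000 paths the simulated price lands within a few cents of the analytic answer; real desks run vastly more paths over far more complex payoffs, which is where GPU throughput pays off.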
 
Originally posted by: lopri
We need a complete overhaul in Intellectual Property laws. Especially tech-related ones. I think most will agree that the time goes much faster in technology than anywhere else in our life.

Indeed. The x86 license held so tightly is stifling tech advancement.
 
Originally posted by: Kakkoii
Originally posted by: akugami
I can easily see a dual-core CPU with an on-die GPU that can handle HD video playback being all that a general consumer would need.

Nvidia's Tegra platform can already do that. And boy does it do it well.

http://apcmag.com/nvidia-tegra...ayback-under-1watt.htm

http://n4g.com/industrynews/News-347495.aspx

I realize that, but x86 is so entrenched for desktops and regular notebooks that nVidia will have an extremely tough time breaking the Windows dominance, which is about the only way for there to be a large shift from x86 to something else.

I mentioned Tegra in my first post, and I truly believe it is a capable system. It just won't break the x86 market, but rather complement it. As I said in my first post, I think the main PC in a house will still revolve around x86 (or x86-64) and Windows, while mobile devices are in a high state of flux. There is a very real opportunity for nVidia to dominate large segments of the mobile market, from netbooks to smartphones. I just don't see Tegra making any real headway against x86.
 
I sat at a light next to a beat-up pickup and glanced over at the Hispanic driver. He had dried paint specks covering his hands and forearms, and an iPhone in hand with white corded earbuds in place. The imagery served as a reminder of what a tech-junkie society we live in today. It's not just the Seattle coffee-shop geeks; it's people everywhere, of all ages and all walks of life, plugged in to technology, craving more.

nVidia is cognizant of how we're plugged in, and they've positioned themselves quite nicely when you think about it. It's not just about who won the last round in mid-range graphics. It's also about getting into the devices the masses crave and carry. When corporate logos adorn space-faring vehicles, nVidia's will be there.
 
I really don't see the big deal with the chipset business dying. It was low margin anyway. And since the advent of on-CPU memory controllers, what exactly do we get out of a south bridge that would make me pick one over another?

The integrated discussion brings me to this point. So what? What is different in the future that hasn't been happening for years? Intel dominates the market with integrated garbage. AMD will finally get into the game. But I don't expect it to really affect Nvidia that much, since they've never really had a leg up in that market anyway. They can still present value in the mid-range to performance market.

Nvidia's GPGPU effort will make or break them down the road. But I still think the biggest question among the three (Intel, AMD, Nvidia) is this: will AMD be a functioning company by the time they realize any technological gains from their purchase of ATI?
 
Originally posted by: Genx87
I really don't see the big deal with the chipset business dying. It was low margin anyway. And since the advent of on-CPU memory controllers, what exactly do we get out of a south bridge that would make me pick one over another?
Wouldn't you ask that question for Intel? It is Intel that has been protecting its 'bus(es)' via new patents and law suits threats.
 
Originally posted by: Leyawiin
Originally posted by: lopri
We need a complete overhaul in Intellectual Property laws. Especially tech-related ones. I think most will agree that the time goes much faster in technology than anywhere else in our life.

Indeed. The x86 license held so tightly is stifling tech advancement.

What IP and patents are you guys referring to? The x86 ones expired a decade ago.

I worked for a company that still holds an x86 license, Texas Instruments. In 1995 we were producing our own 486 processor clocked to 80MHz. In 1996 we took it to 100MHz and they were selling for $100 (our internal slogan/milestone was "$1 per MHz or bust"). We never produced a pentium-class CPU not for lack of a license or the experience in the x86 market but because of the massive complexity and costs associated with continuing to design competitive x86 processors.

This myth that the barrier to entry into the x86 marketspace is caused by lawyers is really pervasive but lacks substance. The barrier to entry is the roughly $1B you need plus 4-6 years to design a modern-day x86 processor which will actually be competitive with whatever AMD and Intel spend their billions designing in the meantime.

TI, Cyrix, TransMeta, IDT...we all bowed out of the industry for one reason only - we either couldn't compete (lack of resources) or we had leadership that elected to not compete (spent the resources on other core strategies).

No IP issues holding Via back, and you don't see an awe-inspiring breadth of x86-based innovative products coming from them. For their budget the Isaiah chip is impressive, but it is as expected given their budget. Give them even less of a budget, but still the same x86 license and you'd have even less product depth. No boogy-man IP stuff going on there.
 
Originally posted by: Genx87
I really don't see the big deal with the chipset business dying. It was low margin anyway. And since the advent of on-CPU memory controllers, what exactly do we get out of a south bridge that would make me pick one over another?

The integrated discussion brings me to this point. So what? What is different in the future that hasn't been happening for years? Intel dominates the market with integrated garbage. AMD will finally get into the game. But I don't expect it to really affect Nvidia that much, since they've never really had a leg up in that market anyway. They can still present value in the mid-range to performance market.

The thing is, "low margin" doesn't mean "low profit" when you sell zillions of them. Discrete GPUs are priced as high-margin but lower-volume parts: they make a lot per unit but don't sell a lot of units. Integrated chipsets are lower margin, but given the volume of sales it is a highly profitable market. Integrated GPUs don't need to be great, just good enough. While you may consider Intel GPUs crap, the fact is that they're good enough for business systems that aren't running 3D-intensive apps. They're also good enough for Dad or Grandma, whose most demanding PC tasks are surfing the web and watching a few DVDs.
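The margin-versus-volume point can be made with back-of-the-envelope arithmetic. The numbers below are hypothetical, chosen only to illustrate the shape of the argument, not real financials from any company:

```python
# Hypothetical per-unit profits and volumes -- illustrative only.
discrete_unit_profit = 30.0       # dollars of profit per high-end discrete GPU
discrete_units = 1_000_000        # relatively few units sold

integrated_unit_profit = 2.0      # dollars of profit per integrated chipset
integrated_units = 40_000_000     # sold in enormous volume

discrete_total = discrete_unit_profit * discrete_units        # $30 million
integrated_total = integrated_unit_profit * integrated_units  # $80 million

print(f"discrete:   ${discrete_total:,.0f}")
print(f"integrated: ${integrated_total:,.0f}")
```

With these made-up figures, the "low margin" part earns more in aggregate, which is exactly why a high-volume integrated business can be worth fighting over.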
 
Originally posted by: akugami
Originally posted by: Genx87
I really don't see the big deal with the chipset business dying. It was low margin anyway. And since the advent of on-CPU memory controllers, what exactly do we get out of a south bridge that would make me pick one over another?

The integrated discussion brings me to this point. So what? What is different in the future that hasn't been happening for years? Intel dominates the market with integrated garbage. AMD will finally get into the game. But I don't expect it to really affect Nvidia that much, since they've never really had a leg up in that market anyway. They can still present value in the mid-range to performance market.

The thing is, "low margin" doesn't mean "low profit" when you sell zillions of them. Discrete GPUs are priced as high-margin but lower-volume parts: they make a lot per unit but don't sell a lot of units. Integrated chipsets are lower margin, but given the volume of sales it is a highly profitable market. Integrated GPUs don't need to be great, just good enough. While you may consider Intel GPUs crap, the fact is that they're good enough for business systems that aren't running 3D-intensive apps. They're also good enough for Dad or Grandma, whose most demanding PC tasks are surfing the web and watching a few DVDs.

Is it? I haven't really paid that much attention to the chipset business. I remember at the turn of the century, Intel's chipset division routinely turned in losses, but the value-add sold more CPUs, so the net result was a profit. Nvidia doesn't have a CPU to sell, and they can do fine licensing out SLI on Intel and AMD chipsets.

My point about the integrated chipsets is that the article basically said the status quo is going to continue, which I agree with. Intel is going to own the integrated chipset market. But that doesn't mean Nvidia is going to be hurt by that prediction. There is still a market for higher-performing parts. Larrabee is the unknown variable in this argument.

/shrug
 
Originally posted by: Idontcare
Originally posted by: Leyawiin
Originally posted by: lopri
We need a complete overhaul in Intellectual Property laws. Especially tech-related ones. I think most will agree that the time goes much faster in technology than anywhere else in our life.

Indeed. The x86 license held so tightly is stifling tech advancement.

What IP and patents are you guys referring to? The x86 ones expired a decade ago.

I worked for a company that still holds an x86 license, Texas Instruments. In 1995 we were producing our own 486 processor clocked to 80MHz. In 1996 we took it to 100MHz and they were selling for $100 (our internal slogan/milestone was "$1 per MHz or bust"). We never produced a pentium-class CPU not for lack of a license or the experience in the x86 market but because of the massive complexity and costs associated with continuing to design competitive x86 processors.

This myth that the barrier to entry into the x86 marketspace is caused by lawyers is really pervasive but lacks substance. The barrier to entry is the roughly $1B you need plus 4-6 years to design a modern-day x86 processor which will actually be competitive with whatever AMD and Intel spend their billions designing in the meantime.

TI, Cyrix, TransMeta, IDT...we all bowed out of the industry for one reason only - we either couldn't compete (lack of resources) or we had leadership that elected to not compete (spent the resources on other core strategies).

No IP issues holding Via back, and you don't see an awe-inspiring breadth of x86-based innovative products coming from them. For their budget the Isaiah chip is impressive, but it is as expected given their budget. Give them even less of a budget, but still the same x86 license and you'd have even less product depth. No boogy-man IP stuff going on there.

So, you're saying that if NVIDIA put out an x86 chip next week (even a crappy one), Intel wouldn't take them to court?
 
Originally posted by: nitromullet


So, you're saying that if NVIDIA put out an x86 chip next week (even a crappy one), Intel wouldn't take them to court?



Like was said above, it completely depends upon which x86 instruction sets they used. If they used only the freely available ones, then no, Intel would have no standing to sue. Of course, nVidia would have produced something akin to a 486 CPU... and what would be the point of doing that? The money invested would almost be better flushed down a toilet, as even Via's current CPUs would be faster.

Now, if nVidia were to produce a CPU using currently protected x86 instruction sets, then yes, Intel would sue and have grounds, as it stands right now.

 
Originally posted by: Beanie46
Originally posted by: nitromullet


So, you're saying that if NVIDIA put out an x86 chip next week (even a crappy one), Intel wouldn't take them to court?



Like was said above, it completely depends upon which x86 instruction sets they used. If they used only the freely available ones, then no, Intel would have no standing to sue. Of course, nVidia would have produced something akin to a 486 CPU... and what would be the point of doing that? The money invested would almost be better flushed down a toilet, as even Via's current CPUs would be faster.

Now, if nVidia were to produce a CPU using currently protected x86 instruction sets, then yes, Intel would sue and have grounds, as it stands right now.

So, essentially we're just arguing semantics... The base x86 instruction set's patents have expired, but I think it's pretty safe to say that when people discuss the "x86 license" they mean the base x86 instruction set license plus the cross-licensed extensions: you know, the licenses required to build a competitive CPU, not just a napkin sketch of an abacus.
 
Originally posted by: Genx87
Originally posted by: akugami
Originally posted by: Genx87
I really don't see the big deal with the chipset business dying. It was low margin anyway. And since the advent of on-CPU memory controllers, what exactly do we get out of a south bridge that would make me pick one over another?

The integrated discussion brings me to this point. So what? What is different in the future that hasn't been happening for years? Intel dominates the market with integrated garbage. AMD will finally get into the game. But I don't expect it to really affect Nvidia that much, since they've never really had a leg up in that market anyway. They can still present value in the mid-range to performance market.

The thing is, "low margin" doesn't mean "low profit" when you sell zillions of them. Discrete GPUs are priced as high-margin but lower-volume parts: they make a lot per unit but don't sell a lot of units. Integrated chipsets are lower margin, but given the volume of sales it is a highly profitable market. Integrated GPUs don't need to be great, just good enough. While you may consider Intel GPUs crap, the fact is that they're good enough for business systems that aren't running 3D-intensive apps. They're also good enough for Dad or Grandma, whose most demanding PC tasks are surfing the web and watching a few DVDs.

Is it? I haven't really paid that much attention to the chipset business. I remember at the turn of the century, Intel's chipset division routinely turned in losses, but the value-add sold more CPUs, so the net result was a profit. Nvidia doesn't have a CPU to sell, and they can do fine licensing out SLI on Intel and AMD chipsets.

My point about the integrated chipsets is that the article basically said the status quo is going to continue, which I agree with. Intel is going to own the integrated chipset market. But that doesn't mean Nvidia is going to be hurt by that prediction. There is still a market for higher-performing parts. Larrabee is the unknown variable in this argument.

/shrug

Intel's chipset revenues were close to 2 billion dollars last quarter. I find it highly unlikely they are losing money on chipsets.

They are also using their own older fabs, so those fabs have a longer life cycle (cost savings); just as important, they don't have to pay anyone else to manufacture the products for them, cutting out a middleman. There's no question that Intel's chipsets are low margin compared to chipsets from other companies like nVidia, but Intel sells so many of them that I find it hard to believe they don't make money. nVidia sure as heck does; why else would they be fighting for the right to produce motherboard chipsets?
 
Originally posted by: nitromullet
So, essentially we're just arguing semantics... The base x86 instruction set's patents have expired, but I think it's pretty safe to say that when people discuss the "x86 license" they mean the base x86 instruction set license plus the cross-licensed extensions: you know, the licenses required to build a competitive CPU, not just a napkin sketch of an abacus.
Exactly. Imagine, for example, the following:

- Samsung starts selling TVs with Atom sockets, along with new algorithms to upscale and post-process high-definition video. The TV itself gives great image quality, but once it detects those user-installed chips, it delivers even better quality or other special effects of your choice.

- Sony develops a new PlayStation based on x86 with user-upgradable x86 CPU daughterboards, and users can swap the CPU daughterboard to their liking. Your PlayStation seems slow? Just buy a Socket 1366 daughterboard from Sony and pop in an i7.

- TSMC starts taking orders for NV's x86 CPUs, which support all known instruction sets and extensions and run Windows. NV has its own chipsets and sockets, but it plans to offer chips for AMD's and Intel's sockets as well.

These are extreme examples, but they do the job of illustrating what patent laws are about. Current U.S. law gives a base term of 20 years for a patent. Eldred v. Ashcroft, a landmark decision, upheld Congress's repeated extension of copyright terms, which can now run as long as 95 years for corporate works. Granted, copyrights and patents are not the same, but both are considered intellectual property, along with trademarks.

20 years is an eternity in the computing world, let alone 95.
 