Nvidia GPUs soon a fading memory?


dzoner

Banned
Feb 21, 2010
114
0
0
I think they will be.

Nvidia is looking at a world of hurt. Within two years it could lose the ability to compete effectively in the notebook, low-end, and mainstream computer markets, and in most of the high-end market as well, as AMD's Fusion line moves to incorporate Bulldozer and Northern Islands alongside already potent on-chip graphics. A design that can seamlessly combine motherboard and discrete graphics as needed gives buyers a compelling reason to pair AMD high-end discrete graphics boards with AMD systems, and it creates an integrated computing environment that neither Nvidia nor Intel alone can effectively compete with.

Meanwhile, Intel can no longer use its 800-pound-gorilla status to strongarm OEMs, and AMD is racking up scads of notebook design wins. With the Fusion lineup waiting in the wings for 2011 and looking ready to take the fight to Intel across the whole range of computing devices, with newer and better Fusion designs coming yearly, and with GlobalFoundries looking increasingly able to erase Intel's time and technology lead on future process nodes, Intel is losing some vital competitive edges. It looks set to start losing substantial market share and profits to AMD in 2011 and beyond, and it will need to get serious about acquiring graphics capabilities to help stop the bleeding.

There is only one company with those resources, and it is currently faceplanting massively ... becoming low-hanging, very juicy fruit ripe for the plucking.

The only real obstacle to that plucking would be the U.S. government, but AMD's resurgence and rosy competitive future might just provide enough argumentative fodder to overcome government objections.


Necro thread.

Super Moderator BFG10K.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Don't forget JHH's ego. That's going to be a massive obstacle to NV ever being a viable acquisition.

I don't think the situation is nearly as dire as you describe. AMD was in a far worse competitive position (having zero highly desirable CPU or GPU products) through all of 2007, 2008 and most of 2009. During a recession. And they're still alive and kicking.

NV is working on getting into new markets with their GPGPU efforts. They still have a top-notch marketing department adept at milking a vacuum, and a loyal, rabid fan base. They're not going anywhere any time soon.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
If it weren't for the laws and the government, there would be no ATI, AMD, or Nvidia. Intel would have swallowed them up years ago.

AMD will never be more than a thorn in Intel's side. Intel needs AMD around to keep the government from splitting the company up.

Nvidia is moving on to different, bigger, and better things. They will be around for many years, unless Intel swallows them up, in which case AMD is in even bigger trouble.

I still see Nvidia making profits year after year; they must be doing something right.
AMD, on the other hand, only made it out of the red sometime last year, thanks to a hefty settlement from the Intel lawsuit.

Larrabee will eat Fusion for breakfast.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I have been a fan of AMD for 15 years, and the story you describe is the same story I have heard and told myself for those 15 years. In two years AMD may gain a little market share against Nvidia in the discrete market, but their desktop and integrated share will be limited by their ability to sell CPUs. Intel simply wins on performance and on its ability to keep AMD in its place. When it comes to integrated graphics, the average consumer doesn't give a shit. As evidenced by Intel's what, 60-65% share of the overall graphics market?
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
It depends on how well Fusion works out. I don't think nVidia will simply die; they have other markets to tap into.
 

HumblePie

Lifer
Oct 30, 2000
14,665
440
126
I have been a fan of AMD for 15 years, and the story you describe is the same story I have heard and told myself for those 15 years. In two years AMD may gain a little market share against Nvidia in the discrete market, but their desktop and integrated share will be limited by their ability to sell CPUs. Intel simply wins on performance and on its ability to keep AMD in its place. When it comes to integrated graphics, the average consumer doesn't give a shit. As evidenced by Intel's what, 60-65% share of the overall graphics market?

Yah, that's because with Win XP you didn't need anything more than 2D graphics. With Vista and Win 7, the introduction of Aero and other GUI features that require more than 2D acceleration shows the most popular OS in the world moving to a more advanced platform. That means better discrete graphics cards will matter more in the future.
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
I have to totally disagree with the OP. I think AMD's Fusion line is going to be a very interesting and competitive product that might help them gain 10% more market share, but they aren't going to threaten Intel's monopoly with it, just force more innovation and competition between the two.

As for nVidia, they aren't in any trouble at all. Even if they fail this round AND the next round of GPUs, they aren't going to hurt. I actually expect the third generation of Fermi to be the one that really breaks out as a success. nVidia tends to design their architectures way too large for the current process; they seem to plan an architecture not to be truly viable until one or two die shrinks in, so we should see Fermi really pay off in a year.

The downside to that strategy is that new features take a long time to enter the design, since they want the architecture to last many years, and then even longer after that to trickle down to their low-end parts, so it diminishes their adaptability. This is why nVidia worked so hard to postpone the features that were supposed to be in the lackluster DX10, so they didn't show up until DX10.1 and DX11. They were still making use of their G80/G92/GT200 architecture and didn't want to redesign it yet.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I think Nvidia's future largely depends on markets that we aren't used to them being a huge part of, at least not until recently. Nvidia used to be huge in the chipset market, and GeForce was their bread and butter for most of their history. It seems like their chipset business is just about dead, and their focus is shifting away from the gaming world. I think for Nvidia to make it long term, they'll need to do well in some other markets.

I think discrete video cards will continue to be a big part of their business, but I'm willing to guess that share will shrink as time goes on. Nvidia has its foot in the door of the HPC world, but it remains to be seen where that takes them.

They can't make an x86 processor, so it seems they are trying to push their GPUs for general computing. For some types of tasks their GPUs really make sense, but we'll have to see how far that can take them. AMD's Fusion, a CPU with stream processors from the Radeon line, could really shoot AMD up in the HPC/GPGPU world.

Tegra is also an interesting part. It seems Nvidia is working hard at making its way into the mobile computing world, but I don't know that much is really being done with Tegra yet. (By the way, a while back Charlie posted an interesting story about Tegra: http://www.semiaccurate.com/2010/04/14/samsung-dumps-nvidia-tegra/ If you're just going to blow it off because it's Charlie, then don't bother reading; it has his typical anti-Nvidia spin. But I thought it was interesting: Nvidia talks about their Tegra wins all the time, yet I don't see many, if any, devices with Tegra, and the story talks about that.)

I don't know that the government would allow Intel and Nvidia to merge. Intel is the largest CPU maker, and Nvidia is the largest discrete GPU maker. Add in Intel's recent legal troubles over unfair business practices, and I just don't know that the government would let it happen.
 

sandorski

No Lifer
Oct 10, 1999
70,678
6,250
126
Too rosy, I think. However, back in the day when Intel shut AMD out on the socket/platform side, everyone thought AMD was dead. AMD took the bull by the horns and has been better off ever since, going from a mere follower of Intel to an industry leader that designed its own platforms, influenced the future of RAM adoption, and brought numerous innovations and technologies to market, but most of all grew its market share by some 50% (IIRC, possibly more) over its follower days. Despite the David-and-Goliath difference between Intel and AMD, AMD has managed to keep up with, and at times even surpass, Intel.

Not too shabby, IMO. It really makes you wonder whether at some point AMD manages to turn the tables on Intel. Mindshare is far less of an issue for AMD these days, likely because of the ATI acquisition, but even without that, the Socket A and Socket 939 Athlons proved that AMD can go toe to toe with Intel and win, technologically speaking. AMD/ATI integration on the most solid AMD chipsets/platforms ever made is bound to have a positive impact. How much is uncertain, but AMD is entering an era where its technological position is the strongest it has ever been, with few signs of weakness. It should be an interesting next few years.
 

MagickMan

Diamond Member
Aug 11, 2008
7,460
3
76
Well, I think Intel learned from the P4 fiasco and won't ever do that again. IMO, S939 dominance was a fluke. That's not to say that AMD won't grow and make fine products, but they won't soundly thrash Intel like they did a few years ago.

As for NV, they'll muddle through and probably squeeze out a balanced GPU in the next 18 months, but Fermi has really hurt them in the eyes of enthusiasts. Case in point: I never thought I'd buy another ATI card, but here I am with a 5850, enjoying every minute of it.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
I'm not too sure about nVidia losing out in the notebook market...
AMD's Fusion platform may lock nVidia out on the AMD side... but the Intel side is larger, and I don't really see Intel delivering high-end graphics solutions yet, so Intel needs nVidia (and/or AMD) for that market.

Then there's Tegra, as mentioned...
And Fermi... could be turned into a success story with the next die-shrink and/or refresh.

nVidia has done it before... the original G80 was pretty 'overweight' as well. It wasn't that noticeable because there was no competition... but the G92 version was a considerably leaner and meaner take on the architecture.

AMD has done it before as well... the 2900 was a total dud, but the 3000 series revamped the architecture into something competitive, and the 4000 series revamped it yet again, giving AMD the edge in price/performance in most cases.

Of course, Fermi also has a GPGPU side to it, where I think it really has no competition from AMD, partly because Fermi has incredibly strong performance, especially in double-precision arithmetic, but also because nVidia has a more mature development environment with Cuda/OpenCL.
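To give an idea of what that maturity buys you, here's a quick sketch (illustrative only, not code from nVidia's SDK; the kernel name and sizes are arbitrary) of a double-precision DAXPY (y = a*x + y) in Cuda, the kind of workload Fermi's double-precision units are built for:

[CODE]
// Minimal CUDA sketch: double-precision DAXPY (y = a*x + y).
// Illustrative only; names and sizes are arbitrary.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__global__ void daxpy(int n, double a, const double *x, double *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one element per thread
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<double> hx(n, 1.0), hy(n, 2.0);
    double *dx, *dy;
    cudaMalloc(&dx, n * sizeof(double));
    cudaMalloc(&dy, n * sizeof(double));
    cudaMemcpy(dx, hx.data(), n * sizeof(double), cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy.data(), n * sizeof(double), cudaMemcpyHostToDevice);

    // 256 threads per block; enough blocks to cover all n elements.
    daxpy<<<(n + 255) / 256, 256>>>(n, 3.0, dx, dy);

    cudaMemcpy(hy.data(), dy, n * sizeof(double), cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]);  // expect 3.0 * 1.0 + 2.0 = 5.0
    cudaFree(dx);
    cudaFree(dy);
    return 0;
}
[/CODE]

The point isn't this particular kernel; it's that the whole round trip (allocate, copy, launch, copy back) is a handful of lines, which is a big part of why researchers picked Cuda up so quickly.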

Another area where nVidia may have good chances is the next generation of consoles, which is probably no more than 2-3 years away.

So we'll have to see... As mentioned above, nVidia has been profitable so far, so there is still plenty of time to fine-tune their product line and design a new strategy for the coming years, if need be.
Perhaps nVidia will not be the same company as we know it today, but I don't see them disappearing anytime soon.
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
Uh huh

Intel just spent hundreds of millions of dollars on their Larrabeast Dev Kit :p

First, Larrabee was put on hold, not outright cancelled. Second, AMD just brought their hexa-core CPUs to the desktop market, and the quad-core i7s still beat them in most tests. How old is the 920 again?

Tegra is used in almost no devices; the only one I know of is the Zune HD. Tegra 2 is going to be used in a few devices, but Qualcomm's Snapdragon is used in a hell of a lot more.

Nvidia is also in the PS3. Meanwhile, ATI is in the 360, the Wii, and I believe the Nintendo handhelds. ATI has already secured the next Xbox console as well as Nintendo's next console and handheld.

Both Intel & AMD have pushed Nvidia out of the chipset market, and soon the IGP market as well. Yes, there is the ION series, but it will have to compete with AMD's Fusion and Intel's HD Graphics GPU core. Average people seem to be fine with Intel "HD Graphics." If Intel develops it enough to play back Blu-ray video, let alone "HD video" from sources like YouTube, the masses won't care about ION.

So that leaves GPGPU, and AMD isn't shut out here either, especially if software adopts OpenCL and/or DirectCompute. Adobe already supports OpenGL, and with all new video cards supporting OpenCL it would make sense for them to adopt it for a broader customer base. And then we can come back to Fusion as well: AMD has a base of technologies and fields they can dip into and program to work together.

Intel will be fine as usual, while AMD is having some success with their hexa-core CPUs selling out. ATI is rolling out Southern Islands later this year as the 6000 series, with Northern Islands the round after at GlobalFoundries.

Yeah Nvidia is in for some lean years.
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
I think the OP's prediction comes a bit too early. We need at least a few more months to see what nVidia's fate or future will be.

In any case, there is that saying (which I may be misquoting): what doesn't kill you only makes you stronger. And that is what happened to ATI, as it has to all living creatures/entities/companies throughout time.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
So that leaves GPGPU, and AMD isn't shut out here either, especially if software adopts OpenCL and/or DirectCompute. Adobe already supports OpenGL, and with all new video cards supporting OpenCL it would make sense for them to adopt it for a broader customer base. And then we can come back to Fusion as well: AMD has a base of technologies and fields they can dip into and program to work together.

AMD is shut out.
With OpenCL they are about three years late to the party. In that time, nVidia has been able to promote Cuda freely to researchers and software vendors.
Adobe has already adopted Cuda for their products. And since AMD still doesn't ship an OpenCL runtime to end users, there's little point in supporting OpenCL anyway, as it would only run on nVidia hardware.
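To illustrate the runtime point (a quick sketch, not production code; a real application would also enumerate devices): here's how an application could probe at startup for an installed OpenCL runtime. On a machine where the only vendor driver stack installed is nVidia's, this finds only nVidia's platform.

[CODE]
// Minimal sketch: probe for an installed OpenCL runtime/ICD at startup.
#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_uint count = 0;
    // Fails, or reports count == 0, when no vendor runtime is installed.
    if (clGetPlatformIDs(0, NULL, &count) != CL_SUCCESS || count == 0) {
        printf("No OpenCL runtime found; falling back to a CPU path.\n");
        return 0;
    }
    cl_platform_id platforms[8];
    clGetPlatformIDs(count < 8 ? count : 8, platforms, NULL);
    for (cl_uint i = 0; i < count && i < 8; ++i) {
        char name[256];
        clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME, sizeof(name), name, NULL);
        printf("OpenCL platform %u: %s\n", (unsigned)i, name);
    }
    return 0;
}
[/CODE]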
 

Scali

Banned
Dec 3, 2004
2,495
0
0
I think the OP's prediction comes a bit too early. We need at least a few more months to see what nVidia's fate or future will be.

Yeah, if you stop thinking like a techhead for a moment, you realize that Fermi isn't really a bad product anyway (the pricing isn't really unreasonable, and the performance is there... mainly it's heavy on power consumption, but how many people will actually be bothered by that?). Nor is it the only product that nVidia needs to get by.
They have survived on their DX10 line in the face of AMD's offerings for quite a while now, and until AMD starts lowering prices, I don't see why that would change.
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
Adobe has already adopted Cuda for their products. And since AMD still doesn't ship an OpenCL runtime to end users, there's little point in supporting OpenCL anyway, as it would only run on nVidia hardware.
Adobe has also added OpenGL support to their products, which the current 5800 series supports at version 4.0.

AMD is shut out.
With OpenCL they are about three years late to the party. In that time, nVidia has been able to promote Cuda freely to researchers and software vendors.
If you go back and read what I said, I used the word "if." ATI Stream is actually older than Cuda; the difference is that AMD/ATI did not have Nvidia's resources or marketing team. But that doesn't change the fact that Folding@home already supports ATI Stream.

It's just a question of anyone else seeing value in adding or adopting OpenCL or DirectCompute. I'm pretty sure FireGL cards are cheaper than Nvidia's competitors.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Adobe has also added OpenGL support to their products, which the current 5800 series supports at version 4.0.

Yeah, and nVidia supports OpenGL as well, so how is that a disadvantage for nVidia?
You can't compare it with Adobe's Cuda support, which doesn't, and won't, run on AMD hardware, ever.

If you go back and read what I said, I used the word "if." ATI Stream is actually older than Cuda; the difference is that AMD/ATI did not have Nvidia's resources or marketing team. But that doesn't change the fact that Folding@home already supports ATI Stream.

That's exactly the point.
Stream was (and is) a steaming pile of crap (which I say sadly, as a Radeon owner/developer). Yes, there's Folding@home, but that's pretty much the only application that ever came of it in all those years (and it isn't independent; ATi employees have worked on the code). The main reason is that it was just too damn painful to develop software with their SDK.
nVidia delivered a much better alternative with Cuda, which is why Cuda has been adopted by various developers, and actual commercial products are available on the market today.

It doesn't matter who was first... what matters is who was best, and so far that has been nVidia. AMD messed up... just like they were first with GPU-accelerated physics and messed that up as well.

It's just a question of anyone else seeing value in adding or adopting OpenCL or DirectCompute. I'm pretty sure FireGL cards are cheaper than Nvidia's competitors.

I don't think price is all that important to that market. Performance is probably going to be a bigger issue. That, and ease of development.
 

shangshang

Senior member
May 17, 2008
830
0
0
It's ridiculous that anyone thinks NV is going to die. I actually think NV's plan is to eventually get out of the desktop gaming market and invest in the mobile/handheld graphics market. That's where the real graphics growth is. Desktop gaming has pretty much plateaued.
 