Nvidia GPUs soon a fading memory?


NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
Intel rarely buys other companies. That is not their style (Not-Invented-Here).

Not true, http://www.alacrastore.com/mergers-acquisitions/Intel_Corporation-503434

These big companies buy small companies for IP all the time, you just don't really hear about it because it's not huge news. It's true they don't buy big companies like nVidia often, but large acquisitions like that are rare for all businesses.

Scali said:
this topic alone makes another fine case for the article: AMD idiot fanboy proclaiming the death of nVidia

Perhaps the article should've just been called 'Are fanboys idiots?' instead of 'Are all AMD fans idiots?' :p
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Not true, http://www.alacrastore.com/mergers-acquisitions/Intel_Corporation-503434

These big companies buy small companies for IP all the time, you just don't really hear about it because it's not huge news. It's true they don't buy big companies like nVidia often, but large acquisitions like that are rare for all businesses.

For IP, yes (I mentioned that in my post, see)... but as I said, they're not going to take over nVidia's GPU business in favour of their own.
The IP thing is just a strategic thing... if NV is up for sale, Intel can't afford not to buy the IP... if Intel doesn't, then AMD will, and Intel will end up having to pay lots of licensing fees to AMD for every GPU they make (nVidia owns a LOT of IP, from their own work and from acquisitions of others, most notably the former giant 3dfx).

Perhaps the article should've just been called 'Are fanboys idiots?' instead of 'Are all AMD fans idiots?' :p

In theory it could...
Problem is, I had this huge collection of examples from AMD fanboys...
Other fanboys are nowhere near as productive when it comes to myths, FUD, fabrications, etc... except perhaps for the linux crowd, but I already honoured those in a previous article.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
The NEXT fusion generation coming in 2012-13 will incorporate bulldozer and Northern Islands and be hotly competitive in every computer market that exists, from smart phones to supercomputers.

Do you have any links for the use of Fusion in Supercomputers?
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Can someone explain to me why Nvidia is so strong in HPC?

Is this partly because of software advantages and/or customer support?

What sort of hardware/software advantage would AMD need in order to make inroads into HPC?

My understanding is that NV's arch is more programmer-friendly to work with, and NV puts a lot more of its own resources into working with third parties.
 

dzoner

Banned
Feb 21, 2010
114
0
0
I'm not too sure about nVidia losing out in the notebook market...
AMD's fusion/platform may lock nVidia out on the AMD side... but the Intel side is larger, and I don't really see Intel delivering high-end solutions yet, so they need nVidia (and/or AMD) for that market.

Then there's Tegra, as mentioned...
And Fermi... could be turned into a success story with the next die-shrink and/or refresh.

Of course Fermi also has a GPGPU-side to it, where I think it really has no competition from AMD, partly because Fermi has incredibly strong performance, especially in double-precision arithmetic, but also because nVidia has a more mature development environment, with CUDA/OpenCL.

Another area where nVidia may have good chances is when the next generation of consoles is coming up, which should probably not be more than 2-3 years away.

So we'll have to see... As mentioned above, nVidia has been profitable so far, so there is still plenty of time to fine-tune their product line and design a new strategy for the coming years, if need be.
Perhaps nVidia will not be the same company as we know it today, but I don't see them disappearing anytime soon.

What Nvidia doesn't have is plenty of time. There's a reason JHH tried to shoehorn a brand new and massive architecture into a brand new fabrication node. While at the time he made that decision he was doubtless considering Larrabee a much larger and closer threat to his gpgpu plans than it turned out to be, there was also Fusion on the horizon, which he knew he had no answer for in the OEM notebook and desktop market. Intel and AMD CONTROL the x86 market and he knew it was only a matter of time before he was shut out of all but the discrete gpu market. Which is exactly what is happening. In reality, it is Fusion that is also going to be the biggest near-term threat to those gpgpu plans.

Markets can turn sour VERY fast when you become price uncompetitive. Look at Nvidia's discrete graphics market position change in just two years. If Fusion is successful in the notebook market, providing performance at a price Intel/Nvidia can't match, that market can turn sour for Nvidia in a year, because THEIR OEM market is in higher end notebooks that need better than Intel graphics. It's a similar scene in the cutthroat entry level and mainstream desktop markets, where Fusion's price/performance advantage can make a big enough difference that AMD could grab substantial market share in a year's time.

That CUDA edge Nvidia has can disappear pretty quick if Nvidia starts having severe cash flow problems and has to start laying people off, while AMD starts putting (or likely already is putting) substantial resources into and focus on developing OpenCL and OpenGL et al. in preparation for Fusion's release. With AMD having stated Fusion IS their future path, it would be unlikely they are still neglecting the programming end of the plan.
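For what it's worth, the kernel-level gap between CUDA and OpenCL is small, which is why that edge is more about tooling and investment than syntax. A minimal sketch, assuming nothing more than a trivial vector add (names and launch parameters are arbitrary):

Code:
// CUDA version of a trivial vector add
__global__ void vec_add(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n)
        c[i] = a[i] + b[i];
}

// The OpenCL C equivalent is nearly line-for-line the same:
//
//   __kernel void vec_add(__global const float *a,
//                         __global const float *b,
//                         __global float *c, int n)
//   {
//       int i = get_global_id(0);
//       if (i < n)
//           c[i] = a[i] + b[i];
//   }

The moat is in the surrounding toolchain (debuggers, profilers, libraries), not the kernels themselves, and that's exactly the part money and headcount can buy.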

Why would Nvidia fare any better with consoles? The same advantages Fusion brings to mobile and desktop applications it would also bring to consoles, along with the ability to tailor a gpu/cpu mix to match a particular console specification.

The major publishers are undoubtedly pressing for next generation consoles that are far easier to port to and from. If so, that doesn't favor Nvidia.

Then there's Apple taking a close look at Fusion. That could be a major hurt for Nvidia.

Fusion is just an incredibly potent computing solution. AMD has to be able to execute on it, but the track record there has been pretty stellar of late.

Nvidia AND Intel will be hard challenged by it.
 

dzoner

Banned
Feb 21, 2010
114
0
0
Do you have any links for the use of Fusion in Supercomputers?

Just logic. The purpose of Fusion is to ever more closely fuse the cpu and gpu functions over time, essentially what Fermi is trying to accomplish, only with an x86 license advantage. It's a natural fit for supercomputers.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Markets can turn sour VERY fast when you become price uncompetitive. Look at Nvidia's discrete graphics market position change in just two years.

Uhh, what changed?
If you only read nerd forums, then yea, you may believe that AMD's Radeon 4000/5000 series made a difference... However, if you look at actual sales figures, nVidia still outsells AMD, even if it is mostly on brand recognition (reminds me of the Pentium 4... all the nerd forums were making a big deal out of it, but the Pentium 4 sold just fine and Intel turned in good business results year after year).
If you take a look at Steam: http://store.steampowered.com/hwsurvey
nVidia still at 61.88%, more than twice AMD's share of 30.92%.
nVidia really has no reason to panic at the moment.

With AMD having stated Fusion IS their future path, it would be unlikely they are still neglecting the programming end of the plan.

AMD/ATi have stated a lot of things over the years.
Just a few years ago, they were claiming that Barcelona would be the most powerful x86 CPU on the market... 40% faster than Clovertown!
http://www.youtube.com/watch?v=G_n3wvsfq4Y
We all know how that went.
AMD may WANT Fusion to be a success, but that doesn't mean they are able to pull it off. They don't have the resources.

In the meantime, Intel is already doing 'Fusion', while AMD has been talking about it for the past 3-4 years, with nothing to show for it.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
If you take a look at Steam: http://store.steampowered.com/hwsurvey
nVidia still at 61.88%, more than twice AMD's share of 30.92%.
nVidia really has no reason to panic at the moment.

And if you look closer you'll see the lion's share of that 61.88% is 8 series cards, further boosted by 9800 and 250 numbers. Which means most were not bought in 2009. Aside from the obvious problems of drawing any valid conclusions from Steam, you could also interpret the number of 4-series cards at 10% being way higher than the number of 2[6789][05] series cards as ATI outselling NV 2 to 1 in the last generation in the mainstream and enthusiast segments.

Nobody denies NV started kicking the living crap out of ATI's ass starting with the 6 series, continued the booty punting with the 7 series and absolutely positively undeniably dominated with the 8 series.

The question raised by this thread is: what have they done for us lately, and are they likely to continue as leaders in the future? Currently signs point to "no."
 

Scali

Banned
Dec 3, 2004
2,495
0
0
And if you look closer you'll see the lion's share of that 61.88% is 8 series cards, further boosted by 9800 and 250 numbers. Which means most were not bought in 2009.

He said '2 years'. The 9800 and 250 most certainly have been bought in the last 2 years; the 9800 was introduced in March 2008, so it hasn't been on the market for much more than 2 years.

Aside from the obvious problems of drawing any valid conclusions from Steam, you could also interpret the number of 4-series cards at 10% being way higher than the number of 2[6789][05] series cards as ATI outselling NV 2 to 1 in the last generation in the mainstream and enthusiast segments.

The thing is, if people massively flocked to AMD in the past 2 years, nVidia wouldn't still be showing a ~62% share on Steam today.
It's not a secret that the 4-series isn't that big in sales volume... point is just that a lot of people have continued to buy nVidia's DX10 products, despite AMD's alternative offerings.
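Just to put a rough number on that: a back-of-envelope sketch, where the yearly turnover rate is a figure I'm making up purely for illustration, shows how fast a genuine exodus would register even in a sticky survey like Steam:

Code:
/* Back-of-envelope sketch. s0 is the Steam share quoted above; the
 * yearly turnover rate r is hypothetical, made up for illustration.
 * If a fraction r of the installed base is replaced each year and
 * every new buyer picked AMD, nVidia's share would decay as s0*(1-r)^t. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    double s0 = 0.6188;  /* nVidia's current Steam share (from the survey) */
    double r  = 0.30;    /* assumed yearly turnover -- hypothetical */
    for (int t = 0; t <= 2; t++)
        printf("year %d: %.1f%%\n", t, 100.0 * s0 * pow(1.0 - r, t));
    /* prints ~61.9% -> ~43.3% -> ~30.3%: a mass exodus would be
     * clearly visible within two years, which is not what we see. */
    return 0;
}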

The question raised by this thread is: what have they done for us lately, and are they likely to continue as leaders in the future? Currently signs point to "no."

Yea, but that's the nerd forum factor.
My point was that it doesn't really matter that nVidia currently doesn't have the most attractive offerings. The general public is more at home with the nVidia brand, so that's what they buy, just like how Pentium 4 outsold Athlons by a huge margin, simply because it had the Intel brand on it.
Enthusiasts really don't have much of an effect on the market...
And Steam already includes quite a few enthusiasts, or at least people who are more hardware-centered than the average PC buyer, as games generally demand high-performance graphics cards.
 

dzoner

Banned
Feb 21, 2010
114
0
0
AMD may WANT Fusion to be a success, but that doesn't mean they are able to pull it off. They don't have the resources.

In the meantime, Intel is already doing 'Fusion', while AMD has been talking about it for the past 3-4 years, with nothing to show for it.

They are currently sampling working Llano chips to selected partners.

Making the above assertions in light of that fact points to an abysmal lack of rationality in your arguments.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
They are currently sampling working Llano chips to selected partners.

Making the above assertions in light of that fact points to an abysmal lack of rationality in your arguments.

Fusion sounds like an integrated graphics product. How is that going to be attractive to the HPC crowd, considering that in many compute applications AMD's top GPU gets smoked by Nvidia's 480? I don't see why anybody would be interested in using a lesser performing GPU for the same application. And what is AMD going to sell in the handheld market?
 

Edgy

Senior member
Sep 21, 2000
366
20
81
Merge of CPU & GPU on one chip provides a relatively MINOR advantage in power and cost savings, and therefore chips like Clarkdale and first-iteration Fusion parts like AMD's Llano are not very interesting except as maybe HTPC solutions or laptop chips. Hardly any impact on Nvidia, as they don't sell too many IGP solutions anymore.
The actual on-die CPU/GPU integration with a shared instruction set, enabling direct access to the GPU via an extension of the x86 instruction set, is where things are headed with AMD (& probably with Intel and Larrabee). With this, assuming they are successful, one chip can seamlessly address the increasing number of applications that require more and more parallel processing.
Imagine a chip with 4 GPU cores with 400 SP each and 4 CPU cores. Provided a successful integration design on die and an x86 instruction set extension to access & address the more appropriate (GPU or CPU) cores based on need, such a chip would be fairly balanced for gaming. How about 10 GPU cores & 2 CPU cores? Voila - GPGPU via x86 instruction set extension.
How about a chip with 3 CPU cores + 1 GPU core (200SP) designed for low power laptops? How about 1 CPU core + 1 GPU core (80SP) for a very low power tablet or even a cell phone?
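No such x86 extension exists today, of course, so any code is speculative; the closest existing analogue is OpenCL's device model, where a fused part would simply expose a CPU device and a GPU device behind one platform. A minimal sketch along those lines (illustrative only, standard OpenCL host API):

Code:
/* Enumerate the compute devices a program would see; on a fused chip
 * the CPU cores and GPU cores would both show up on one platform. */
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platform;
    cl_device_id devices[8];
    cl_uint ndev = 0;

    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_ALL, 8, devices, &ndev);

    for (cl_uint i = 0; i < ndev; i++) {
        char name[256];
        cl_device_type type;
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
        clGetDeviceInfo(devices[i], CL_DEVICE_TYPE, sizeof(type), &type, NULL);
        printf("%s: %s\n", (type & CL_DEVICE_TYPE_GPU) ? "GPU" : "CPU", name);
    }
    return 0;
}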
Nvidia has taken the most logical path possible with its GPGPU strategy & solution, as they only excel on the graphics side. But lacking the CPU side of the equation, this is a solution for a niche market at best (specialized parallel servers/supercomputers).
Having said all that - the discrete graphics market isn't going to disappear any time soon, let alone by 2011 (yes, BD is set to launch in 2011, maybe with some versions with a GPU on chip, but they won't have a fully integrated or "fusioned" BD until much, much later than 2011, and Intel's initial bumbling of Larrabee delayed their plans). That alone can guarantee Nvidia's business (Fermi, while late & hot, is still very competitive), let alone Tegra 2, GPGPU, etc.
When it comes right down to it, assuming there is no total implosion due to incompetent R&D execution & continually poor product development, this business is about marketing and sales. I think most of us agree that both Intel & Nvidia can run circles around AMD when it comes to marketing strategy... at least enough to ensure that Nvidia will be around for a while. Also, in a couple of years, who knows how the situation may change for Nvidia? (Maybe even a CPU, whether it is x86 or not?)
 

dzoner

Banned
Feb 21, 2010
114
0
0
Merge of CPU & GPU on one chip provides a relatively MINOR advantage in power and cost savings, and therefore chips like Clarkdale ...

And you know this how?

Considering that no such chips yet exist on the market.
 

JSt0rm

Lifer
Sep 5, 2000
27,399
3,948
126
hai guys.

After fermi they can get into the toaster business.

BOO-YAA ZING!
/out
 

nyker96

Diamond Member
Apr 19, 2005
5,630
2
81
They still have a top notch marketing department adept at milking vacuum and a loyal, rabid fan base. They're not going anywhere any time soon.

I love this quote! "...adept at milking vacuum and a loyal, rabid fan base."
 

dzoner

Banned
Feb 21, 2010
114
0
0

"We don’t get on-die graphics yet because Intel still hasn’t switched over to its make-everything-at-the-best-process-ever strategy. The 32nm fabs are ramping up with CPU production and the 45nm fabs need something to do. Nearly every desktop and laptop sold in 2010 will need one of these 45nm GMA die, so the fabs indeed have something to do."

Edgy - "Merge of CPU & GPU on one chip provides relatively MINOR advantage in power and cost savings and therefore chips like Clarksdales ..."

chip = die = chip

Clarkdale is 1 32nm cpu chip + 1 45nm graphics chip on one processor package.

Fusion is one 32nm chip with both cpu and graphics cores on die.

ergo - there is not yet a released 'Merge of CPU & GPU on one chip' part.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
They are currently sampling working Llano chips to selected partners.

We were talking about making it a success, requiring a lot of SOFTWARE support.
No doubt they can copy-paste an IGP onto a CPU, Intel's already shown them how it's done.
But that has NOTHING to do with pushing nVidia out of the DISCRETE market...
You need a LOT more than that.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
ergo - there is not yet a released 'Merge of CPU & GPU on one chip' part.

I don't think that matters much for the argument...
Namely, although the GPU is 45 nm, we are comparing against non-integrated CPU + IGP solutions, where all CPUs are 45 nm as well, and IGPs are sometimes on even older processes than 45 nm.
So if anything, the 32 nm CPU would skew the comparison in Clarkdale's favour, as we don't have any non-integrated 32 nm solutions to compare to.
Despite that, the difference isn't very large.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
"We don’t get on-die graphics yet because Intel still hasn’t switched over to its make-everything-at-the-best-process-ever strategy. The 32nm fabs are ramping up with CPU production and the 45nm fabs need something to do. Nearly every desktop and laptop sold in 2010 will need one of these 45nm GMA die, so the fabs indeed have something to do."

Edgy - "Merge of CPU & GPU on one chip provides relatively MINOR advantage in power and cost savings and therefore chips like Clarksdales ..."

chip = die = chip

Clarkdale is 1 32nm cpu chip + 1 45nm graphics chip on one processor package.

Fusion is one 32nm chip with both cpu and graphics cores on die.

ergo - there is not yet a released 'Merge of CPU & GPU on one chip' part.

Where did I hear that before?
Oh, yeah...AMD's FUD about "Native Quadcore" :hmm:

How did that turn out? :D
 

A_Dying_Wren

Member
Apr 30, 2010
98
0
0
What Nvidia doesn't have is plenty of time. There's a reason JHH tried to shoehorn a brand new and massive architecture into a brand new fabrication node. While at the time he made that decision he was doubtless considering Larrabee a much larger and closer threat to his gpgpu plans than it turned out to be, there was also Fusion on the horizon, which he knew he had no answer for in the OEM notebook and desktop market. Intel and AMD CONTROL the x86 market and he knew it was only a matter of time before he was shut out of all but the discrete gpu market. Which is exactly what is happening. In reality, it is Fusion that is also going to be the biggest near-term threat to those gpgpu plans.

Markets can turn sour VERY fast when you become price uncompetitive. Look at Nvidia's discrete graphics market position change in just two years. If Fusion is successful in the notebook market, providing performance at a price Intel/Nvidia can't match, that market can turn sour for Nvidia in a year, because THEIR OEM market is in higher end notebooks that need better than Intel graphics. It's a similar scene in the cutthroat entry level and mainstream desktop markets, where Fusion's price/performance advantage can make a big enough difference that AMD could grab substantial market share in a year's time.

That CUDA edge Nvidia has can disappear pretty quick if Nvidia starts having severe cash flow problems and has to start laying people off, while AMD starts putting (or likely already is putting) substantial resources into and focus on developing OpenCL and OpenGL et al. in preparation for Fusion's release. With AMD having stated Fusion IS their future path, it would be unlikely they are still neglecting the programming end of the plan.

Why would Nvidia fare any better with consoles? The same advantages Fusion brings to mobile and desktop applications it would also bring to consoles, along with the ability to tailor a gpu/cpu mix to match a particular console specification.

The major publishers are undoubtedly pressing for next generation consoles that are far easier to port to and from. If so, that doesn't favor Nvidia.

Then there's Apple taking a close look at Fusion. That could be a major hurt for Nvidia.

Fusion is just an incredibly potent computing solution. AMD has to be able to execute on it, but the track record there has been pretty stellar of late.

Nvidia AND Intel will be hard challenged by it.

But isn't there a very valid reason for splitting up the GPU and the CPU: distributing the heat around the laptop more evenly? I'm fairly sure you're going to have one massive hot spot on laptops, which is not going to be particularly appreciated if you have power equivalent to what Nvidia churns out sitting right next to AMD's less efficient CPUs.

AMD (not ATI) hasn't really had a stellar track record of late. They've been deeply relegated to budget computing on normal desktops ever since C2D. They only really compete in the server arena because server software is considerably more multi-threaded, so AMD can go crazy on cores. I haven't seen any proper reviews of their latest notebook cpus, but I'm quite willing to bet they're not up to scratch with Intel's stuff.

ATI has done well since the 4xxx series for, I think, the same reasons Nvidia will in the future do well with Fermi. Both the 2xxx and 4xx series were initially hot and loud (at least the 4xx series gave very decent performance). The 2xxx series became amazing with die shrinks. Fermi might too.

And I think you overplay this "cost/performance" stuff. It's fairly clear such a ratio has little to do with the success of a laptop/desktop outside the enthusiast realm. It's all marketing and vendor relations, and Nvidia and Intel have that down pat.

The real danger to Nvidia, I think, is if Intel's graphics unit actually became competitive, which isn't too likely. And no, I don't think Larrabee is anywhere in the mind of JHH. Intel hasn't even so much as churned out a valid (current) roadmap with Larrabee on it, AFAIK.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
The real danger to Nvidia, I think, is if Intel's graphics unit actually became competitive, which isn't too likely. And no, I don't think Larrabee is anywhere in the mind of JHH. Intel hasn't even so much as churned out a valid (current) roadmap with Larrabee on it, AFAIK.

Intel themselves always indicated that they were aiming for mainstream performance with Larrabee, initially.
So they never really gave nVidia or AMD any reason to worry all that much. Not even Intel was expecting to get anywhere near the performance crown.

I think if there's anything to worry about, it's the fact that both AMD and nVidia depend on TSMC manufacturing. We've seen with both the Radeon 5000-series and the nVidia 400-series that TSMC has not been very reliable of late. They didn't have their 40 nm process in order at all (and even today, 5000-series cards are relatively scarce, so it's probably still not working all that well).
Intel will probably never suffer manufacturing issues with GPUs, since they have already sorted out their manufacturing on their CPU product line.
So if this situation happens again while Intel has a GPU on the market, perhaps Intel can make it a success simply because theirs are the only parts actually available (and without price inflation).
 