Nvidia GPUs soon a fading memory?

I am a fan of AMD, and would really like to see them beat Intel, or at least be competitive. However, the original post reads like it was written by the AMD marketing department on a high. Despite purchasing ATI at a very steep price, we have yet to see Fusion, while Intel already has (admittedly lousy) graphics integrated into the CPU. AMD will be lucky to survive and make a reasonable profit, much less put the hurt on Intel.

They do seem to have an edge on nVidia in the discrete gaming graphics market, but it remains to be seen whether ATI's gaming-first approach or nVidia's GPGPU approach is the correct one. Unfortunately, the next generation of consoles could hurt PC gaming even more, in which case nVidia's bet on using the GPU for general computing may prove right.

If only AMD could be as competitive in the CPU market as in the GPU market, they would be in good shape. I would like to see it, but so far it hasn't happened.
 

BenSkywalker

Diamond Member
Oct 9, 1999
Intel is losing some vital competitive edges, looks set to start losing substantial market share and profits to AMD in 2011 and beyond, and will need to get serious about acquiring graphics capabilities to stop the bleeding.

The overwhelming majority of people today are fine with Intel integrated graphics. Those people who aren't fine with integrated solutions aren't going to be fine with on-package solutions in the next ten years either. Moving integrated graphics from the mobo onto the CPU package is a cost-cutting measure. There isn't going to be a major shift in the PC space because of it. The only markets where it becomes attractive are those that value reduced power consumption over performance: small/ultraportable devices. Those markets aren't interested in C2D or Bulldozer at all.

ATI has already secured the next Xbox consoles as well as Nintendo's next console & handheld.

Based on every report and rumor I have seen, Nintendo's next handheld is Tegra 2 based. We will find out for sure next month, but that is what every site I have seen has been reporting for a while now (and I read the console news updates every day, too).

Yeah Nvidia is in for some lean years.

We keep hearing this, yet over the last year nV is up almost 100%. Don't underestimate how much being well run matters to a business's bottom line.
 

SolMiester

Diamond Member
Dec 19, 2004
I think they will be.

Nvidia is looking at a world of hurt: within two years it will lose the ability to effectively compete in the notebook, low-end and mainstream computer markets, and in most of the high-end market, as AMD's Fusion line moves to incorporate Bulldozer and Northern Islands. The already potent on-chip graphics, able to seamlessly incorporate motherboard and discrete graphics as needed, will provide a compelling reason to buy AMD high-end discrete graphics boards for AMD systems, and an integrated computing environment that neither Nvidia nor Intel alone can effectively compete with.

Meanwhile, with Intel no longer able to use its 800 lbs. of gorillaness to strongarm OEMs, AMD racking up scads of notebook design wins, its Fusion line-up waiting in the 2011 wings and looking ready to take it to Intel across the computing-device board with newer, better Fusion designs coming yearly, and GlobalFoundries looking increasingly able to decimate Intel's time and technology lead on future process nodes, Intel is losing some vital competitive edges, looks set to start losing substantial market share and profits to AMD in 2011 and beyond, and will need to get serious about acquiring graphics capabilities to stop the bleeding.

There is only one company that has such resources, and it is currently massively faceplanting... becoming a low-hanging and very juicy fruit, ripe for the plucking.

The only real obstacle to that plucking would be the U.S. government, but AMD's resurgence and rosy competitive future might just provide argumentative fodder sufficient to overcome the government's objections.

LOL, what is this... a vision? nV is far too big to suffer your rantings.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
If AMD can get ATI's SPs to do x86 code, at least for FP operations, then they'll have a big step up on pretty much all the competition.

And NVIDIA isn't going anywhere for a long time. Say what you will about them, you cannot deny that they are a very competently run company with prowess and vision.
 

Dribble

Platinum Member
Aug 9, 2005
Imo the notebook and desktop markets are stagnant. The growth markets are smartphones, ebooks, tablets, netbooks, etc. Hence nVidia's Tegra and Intel's Atom, both looking to break into that space.

AMD have stayed concentrated very much on traditional markets: they have a good discrete GPU, decent CPUs, and in Fusion a budget CPU/GPU solution (although it's very late, coming out years after Intel managed it despite Intel having a naff GPU division). This is OK today, but in the end it will cost them; they needed to be in the low-end portable market. If I were AMD I'd be scrambling to get involved there.

The other major area of change is GPU compute, which is being pushed by nVidia. This is inevitably going to change the CPU/GPU balance. All supercomputers will probably end up with powerful GPUs alongside their CPUs. The server and workstation market will come next, and games as soon as the next-gen consoles (which will support this) arrive.

The supercomputer and workstation markets aren't growing, but nVidia may well displace a lot of AMD/Intel CPUs with its GPUs there. AMD can at least compete here, although both its software and hardware are well behind nVidia's right now. Intel has nothing.

Hence I suspect nVidia will lose a little of their traditional market, but gain a lot in both the portable and supercomputer/server markets. They will probably do fine.

Right now it's looking bleaker for AMD, really: they will gain slightly in the desktop/laptop market, but it's a shrinking pot and there isn't much money to be made. They will lose in the supercomputer market, and lose out big time in portables. Tbh I expect AMD to fall back into their debt troubles, fighting to survive once again.

Intel have such a strong base that they will be OK, but importantly the world is moving away from x86. The portables are well entrenched with ARM and show no sign of changing. The x86 argument was always that all the software is x86, so you must use x86. Well, in portables all the software is ARM; Intel isn't going to make x86 dominant there - it's too late. Equally, at the very high end, GPU compute will not be x86 either. Intel had their shot with Larrabee, and they failed. It is currently Cuda, and it will move to OpenCL/DirectX Compute; none of these are x86-only (or x86 at all, in fact). Hence imo Intel will survive, but no longer be the dominant player they once were.

Microsoft seem to be failing to shift anything but Windows (the traditional market again). The big winners here seem to be Apple and Google, who have both been much more proactive and effective. Hence Microsoft declines a bit, and Apple/Google grow.
 

Sylvanas

Diamond Member
Jan 20, 2004
Nvidia isn't going anywhere anytime soon; they'd need several sustained bad years with no profit to be in real trouble. They have lost their motherboard chipset business, which is a loss, but in the grand scheme of things Nvidia is, and probably always will be, a discrete graphics company. I can definitely see AMD taking market share from Intel with Fusion, but I doubt this will impact Nvidia much (it makes its money in discrete sales). AMD is also making leaps and bounds in laptop sales and market share, a trend that looks set to continue with their current focus on performance per watt.
 

Scali

Banned
Dec 3, 2004
If AMD can get ATI's SPs to do x86 code, at least for FP operations, then they'll have a big step up on pretty much all the competition.

Of course not.
x86 code doesn't take advantage of the massive parallelism of a GPU at all.
And a single FP unit on a GPU is nowhere near as fast as an x86 unit.
The only right solution is to extend the x86 instruction set to include ultra-wide SIMD instructions... which is exactly what Intel's Larrabee is about.
Obviously this requires all x86 code to be rewritten to take advantage of it... so it doesn't really matter anyway. Might as well rewrite it in OpenCL, Cuda or DirectCompute while you're at it. Cuda is the best option there, since it supports regular C/C++, just like x86, whereas the others offer a more limited, shader-like C dialect.
So nVidia knew what they were doing.
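To make that contrast concrete, here's a minimal sketch of the kind of rewrite involved (my own illustration, nothing from any shipping SDK; vec_add and the sizes are made up). The serial x86 loop for (i = 0; i < n; i++) c[i] = a[i] + b[i] becomes a Cuda kernel launched across thousands of threads, one element per thread:

#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>

// One thread per element: the data-parallel shape a GPU needs,
// and exactly what a serial x86 loop never expresses.
__global__ void vec_add(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main(void)
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host buffers with known values, so the result is checkable.
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; i++) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Separate device buffers, plus explicit copies over the bus.
    float *da, *db, *dc;
    cudaMalloc((void **)&da, bytes);
    cudaMalloc((void **)&db, bytes);
    cudaMalloc((void **)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // 256 threads per block, enough blocks to cover all n elements.
    vec_add<<<(n + 255) / 256, 256>>>(da, db, dc, n);
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

    printf("c[0] = %f\n", hc[0]); /* expect 3.0 */

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}

All the ceremony around the launch is also why existing x86 code gets rewritten rather than just recompiled.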
 

Scali

Banned
Dec 3, 2004
Right now it's looking bleaker for AMD, really: they will gain slightly in the desktop/laptop market, but it's a shrinking pot and there isn't much money to be made. They will lose in the supercomputer market, and lose out big time in portables. Tbh I expect AMD to fall back into their debt troubles, fighting to survive once again.

Indeed.
AMD's problem is that the GPU market is orders of magnitude smaller than the CPU market.
No matter how well their GPU division does, it cannot carry the company when the CPU division is running at a loss.
And currently, AMD's CPU division isn't looking very good compared to Intel. They're going to be booted out of the supercomputer market by Nehalem and its derivatives, and they are currently clinging onto the desktop market by selling six-cores against Intel's two-year-old quad-core line. That is only going to hurt more when mainstream 32 nm parts and Sandy Bridge arrive later this year.

Therefore I also suspect that if AMD really gets into trouble with its CPU division, it will have to make the only smart business move it can and reallocate the GPU division's resources toward the CPU division.
So, in a way, nVidia's future depends on Intel's CPUs.
 


Genx87

Lifer
Apr 8, 2002
It's ridiculous that anyone thinks NV is going to die. I actually think NV's plan is to eventually get out of the desktop gaming market and be invested in the mobile/handheld graphics market. That's where the real graphics growth is. Desktop gaming has pretty much plateaued.

I have the same belief, especially when you see where desktop graphics is going with AMD's Fusion and Intel's Larrabee: basically good-enough integrated graphics. The discrete graphics market has been nearly cut in half in the past 24 months. The mobile markets are huge and will only get bigger as phones gain more capability. I think the next big revolution will be actual gaming on those devices.

I think Nvidia will continue to make discrete graphics to fund their move into these other markets. But it wouldn't surprise me at all if in 10 years there is no discrete market and Nvidia is out of it completely.
 

dzoner

Banned
Feb 21, 2010
Nvidia isn't going anywhere anytime soon; they'd need several sustained bad years with no profit to be in real trouble. They have lost their motherboard chipset business, which is a loss, but in the grand scheme of things Nvidia is, and probably always will be, a discrete graphics company. I can definitely see AMD taking market share from Intel with Fusion, but I doubt this will impact Nvidia much (it makes its money in discrete sales). AMD is also making leaps and bounds in laptop sales and market share, a trend that looks set to continue with their current focus on performance per watt.

In what alternate universe? In the current market, one bad year with no profit and they're in a world of hurt. THEY DON'T HAVE AN x86 LICENSE. And Fusion and Intel's fusion chips WILL kill Nvidia's OEM graphics markets.

Fusion is designed to replace motherboard and OEM discrete GPUs.

AMD"s first generation Fusion chips will span the market from sub 1 watt chips to a top end part AT LEAST comparable to an Athlon II X4 635 + HD 5670 combination.

That covers 95% of Nvidia's entire OEM market, which is Nvidia's bread and butter, not the discrete market.

The NEXT Fusion generation, coming in 2012-13, will incorporate Bulldozer and Northern Islands and be hotly competitive in every computer market that exists, from smartphones to supercomputers.
 

Scali

Banned
Dec 3, 2004
Fusion is designed to replace motherboard and OEM discrete GPUs.

Only for the low end.
Because a Fusion chip shares the memory controller and main memory with the CPU, it is impossible to build a high-end GPU this way.
High-end laptops have GPUs with their own memory controller and dedicated GDDR, which delivers oodles more bandwidth than main memory and doesn't have to be shared with the CPU either.
This is pretty much mutually exclusive with the Fusion concept, and as such I don't see Fusion ever threatening high-end GPUs, either for laptops or desktops.
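To put rough numbers on it (back-of-the-envelope, assuming typical parts of the day): dual-channel DDR3-1333 peaks at about 2 channels × 8 bytes × 1333 MT/s ≈ 21 GB/s, and the CPU and GPU have to split that. A mid-range discrete card like the HD 5670, with 4 Gbps-effective GDDR5 on a 128-bit bus, gets 16 bytes × 4 GT/s = 64 GB/s all to itself, and a high-end board like the HD 5870 sits around 154 GB/s. That is the gap an on-package GPU would have to close through the same pins the CPU is using.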
 

dzoner

Banned
Feb 21, 2010
If AMD can get ATI's SPs to do x86 code, at least for FP operations, then they'll have a big step up on pretty much all the competition.

And NVIDIA isn't going anywhere for a long time. Say what you will about them, you cannot deny that they are a very competently run company with prowess and vision.

Where this is headed is pretty clear: discrete CPUs and GPUs going away, and fully integrated Fusion chips taking their place within 5 years.

AMD has flatly said Fusion IS its future, and since optimizing drivers to make the best use of the GPU is important to Fusion's success, it's pretty likely AMD is putting some serious resources into its programming division to make sure substantial progress has been made on that front by the time Fusion goes public.

By the time Fusion 2 arrives in 2012-13, with Bulldozer and NI cores, AMD should have a pretty complete, polished and competitive set of APU solutions in hand, from handhelds to supercomputers.

Intel will be crippled on the GPU part of an APU.
Nvidia will be crippled on the CPU part of an APU.

There's a reason JHH took a leap off a cliff with both fingers crossed on Fermi: a new, massive and complex architecture on a new and troubled process. He didn't think he had the TIME to do anything else and stay competitive. Unfortunately for him, he broke both legs and an arm when he landed.

That hurt.

Nvidia will not survive to 2015. Chances are Intel will absorb it to try to catch up with AMD's considerable graphics lead, plus leapfrog AMD with CUDA. The sooner the better for Intel. I would be very surprised if Intel didn't make Nvidia shareholders an offer they couldn't refuse by the end of the year.
 

Genx87

Lifer
Apr 8, 2002
If AMD has been able to survive the past 15 years of bloodletting, then Nvidia, with a superior product and lots of money, will still be around in 5 years, lol.

Nvidia just might look like a different company in terms of product portfolio.

Did AMD build a new handheld division or something? They sold theirs off about 15 months ago to Qualcomm.

And with what are they going to compete in the supercomputer realm?
 

Seero

Golden Member
Nov 4, 2009
I think they will be.

Nvidia is looking at a world of hurt: within two years it will lose the ability to effectively compete in the notebook, low-end and mainstream computer markets, and in most of the high-end market, as AMD's Fusion line moves to incorporate Bulldozer and Northern Islands. The already potent on-chip graphics, able to seamlessly incorporate motherboard and discrete graphics as needed, will provide a compelling reason to buy AMD high-end discrete graphics boards for AMD systems, and an integrated computing environment that neither Nvidia nor Intel alone can effectively compete with.

Meanwhile, with Intel no longer able to use its 800 lbs. of gorillaness to strongarm OEMs, AMD racking up scads of notebook design wins, its Fusion line-up waiting in the 2011 wings and looking ready to take it to Intel across the computing-device board with newer, better Fusion designs coming yearly, and GlobalFoundries looking increasingly able to decimate Intel's time and technology lead on future process nodes, Intel is losing some vital competitive edges, looks set to start losing substantial market share and profits to AMD in 2011 and beyond, and will need to get serious about acquiring graphics capabilities to stop the bleeding.

There is only one company that has such resources, and it is currently massively faceplanting... becoming a low-hanging and very juicy fruit, ripe for the plucking.

The only real obstacle to that plucking would be the U.S. government, but AMD's resurgence and rosy competitive future might just provide argumentative fodder sufficient to overcome the government's objections.

You seem to be 15 years behind, my friend. Intel's i5 661 is a CPU that contains a GPU, and it is no good at either performance or sales. Before Intel, Nvidia had released onboard video that actually scales with PCIe video cards, called Hybrid SLI. After that, Nvidia created Optimus, which allows the discrete video card to be shut down completely when not needed and seamlessly switched back on.

At first the Fusion idea sounded good, but it is clear that it means absolutely nothing to the high-end/mainstream user. The fact that a video card draws more power and produces more heat than a CPU indicates that you really can't put a high-end CPU and a high-end GPU together on one die. So while it may become a good solution for mobile or for any tiny PC where space matters, it is not going to be something worth talking about in the high-end market.

Has anyone talked about Nvidia's hybrid tech? Has anyone talked about Intel's i5 661?

As of now, Intel is trying to get into the mobile/smartphone market, and Nvidia has Tegra, with Tegra 2 on the way. These chips draw much less power and therefore don't need a huge heatsink. Nvidia is trying to use the GPU to do what the CPU does, Intel tries to have the GPU offload graphics from the CPU, and AMD is somewhere in between the two. As of now, I don't think AMD has anything special in those fields. Fusion will be a good start, for AMD.
 

Scali

Banned
Dec 3, 2004
Where this is headed is pretty clear: discrete CPUs and GPUs going away, and fully integrated Fusion chips taking their place within 5 years.

As I said in my post (which you apparently ignored because it doesn't suit your agenda), un-possible!

AMD has flatly said Fusion IS its future, and since optimizing instructions to make the best use of the GPU is important to Fusion's success, it's pretty likely AMD is putting some serious resources into its programming division to make sure substantial progress has been made on that front by the time Fusion goes public.

What programming division? They can't even get decent OpenCL on the market, they failed at GPU physics... and now suddenly they can make Fusion a success? That's a much bigger task than OpenCL or GPU physics, and difficult to pull off anyway if you're not the biggest CPU manufacturer. Need I remind you of 3DNow!? Nice technology at the time, better than what Intel offered... but it never took off because too few products on the market supported it.

Intel will be crippled on the GPU part of an APU.

Firstly, I don't think performance is an important factor in that market. As long as you can do basic desktop work, video acceleration and such (which Intel can already do with their current products), it's fine.
Secondly, Intel is still working on Larrabee, so it's not as if they're not trying to develop more powerful GPU technology for their future products.

Nvidia will be crippled on the CPU part of an APU.

Which may or may not be a problem for nVidia's future. As mentioned, there are plenty of other markets around.

Nvidia will not survive to 2015. Chances are Intel will absorb it to try to catch up with AMD's considerable graphics lead, plus leapfrog AMD with CUDA. The sooner the better for Intel. I would be very surprised if Intel didn't make Nvidia shareholders an offer they couldn't refuse by the end of the year.

Intel rarely buys other companies; that is not their style (Not-Invented-Here). Perhaps Intel will recruit some of nVidia's engineers (and take over the IP portfolio), but I think they would prefer to stick with Larrabee and make that a success.
 

grimpr

Golden Member
Aug 21, 2007
Wait a minute, aren't you the same "Scali" who wrote that thing at Theo's BSN entitled "Are all AMD fans - idiots?"? Well, looks like some time passed by and found you with a DAAMIT idiot video card, writing code in AMD's shitty unique GPU/CPU OpenCL kit. Congratulations, you're past your elitist bias. As for Intel's Larrabee, one thing is for certain: it ain't gonna run nor support CUDA. Now take a wild guess who's gonna be HPC's darling for the next decade; it sure ain't NV nor its CUDA.
 

Scali

Banned
Dec 3, 2004
Wait a minute, aren't you the same "Scali" who wrote that thing at Theo's BSN entitled "Are all AMD fans - idiots?"? Well, looks like some time passed by and found you with a DAAMIT idiot video card, writing code in AMD's shitty unique GPU/CPU OpenCL kit. Congratulations, you're past your elitist bias.

I'm not biased, never was... and I'm not an AMD fan. This isn't the first AMD/ATi product I've owned, either. I still stand by every word of that article. I guess you just didn't understand it (this topic alone makes another fine case for the article: an AMD idiot fanboy proclaiming the death of nVidia).

As for Intel's Larrabee, one thing is for certain: it ain't gonna run nor support CUDA. Now take a wild guess who's gonna be HPC's darling for the next decade; it sure ain't NV nor its CUDA.

Larrabee will support both OpenCL and C++ code.
nVidia does the same through Cuda.
So it doesn't really matter.
 