Nvidia feels small compared to Intel/AMD


AzN

Banned
Nov 26, 2001
4,112
2
0
How much do you think it costs to produce a card with its own memory and PCB? A lot more than a chip built onto the mobo using system RAM as video RAM. Think whatever you want. This is going to happen whether you believe it or not.

AMD's concept is called Fusion for a reason.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Azn
How much do you think it costs to produce a card with its own memory and PCB? A lot more than a chip built onto the mobo using system RAM as video RAM. Think whatever you want. This is going to happen whether you believe it or not.

AMD's concept is called Fusion for a reason.

I never said it wasn't going to happen... I just said that it wasn't going to be the cheap gaming solution that some people are touting it to be. If it performs well, you'll have your "premium" motherboards that include a mid-range GPU and cost an arm and a leg, and you'll still have your standard crappy integrated graphics chips for people who don't even want/need (aren't willing to pay for) mid-range graphics capabilities. If it doesn't perform well, it will just be a new name for the same old crappy onboard graphics.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
You don't know if it's going to cost an arm and a leg or if it's going to perform like current onboard graphics. Intel is putting a lot of money into this and bought out Kyro for this purpose, and AMD bought ATI. So far, onboard anything costs less than separate cards. That's reality.
 

JSt0rm

Lifer
Sep 5, 2000
27,399
3,948
126
Honestly, I think Nvidia will have to release video cards at cheaper prices. If the high end is all they own, then they will have to make the high end appeal to more people.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Azn
You don't know if it's going to cost an arm and a leg or if it's going to perform like current onboard graphics. Intel is putting a lot of money into this and bought out Kyro for this purpose, and AMD bought ATI. So far, onboard anything costs less than separate cards. That's reality.

That is reality, but current Intel onboard graphics also can't touch even the low-end discrete chips in terms of performance. This makes sense, because they are designed to fill a different market niche. However, as you said, Intel is putting a lot of money into this... Hence, they will probably want a return on their investment, which they won't get if they sell their brand-new onboard chip for peanuts. Intel already has the greatest market share (by far) for integrated chips. What you are proposing would cannibalize their own market share, because nobody in their right mind would buy a GMA950-type chip anymore if they could get a much superior chip for only a few bucks more.

Another thing: look at Intel's history... They have seldom driven technology forward to give us cheaper products. Every time they drive technology forward, it costs us to keep up. They've pushed us towards more expensive (non-backwards-compatible) technologies even when the performance benefit wasn't there. This was the case with Rambus, DDR2, PCIe, and now with DDR3. Even the Core 2 Duo chips weren't that cheap when they first came out; they were, however, worth the money, and they dropped in price fairly quickly.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: AnonymouseUser
Nvidia needs to buy out or merge with VIA. I think "NVia" could be very competitive with both AMD and Intel, especially in the mobile arena.
Better yet, Nvidia could merge with DAAMIT and create... um, I was never good at these word puzzles... never mind...

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Firstly, Intel's lead in the total graphics market is being contested by nVIDIA. Want to see the numbers?

Nvidia is closing the gap for the number one position in graphics, AMD loses ground and Intel is flat

Growth of 82.9% (Intel at 3.1%); nVIDIA market share of 32.6% (Intel at 19.5%). This is based on Q2 '07.

Now tell me, how is nVIDIA going to disappear within the next few years? To me that's just false. There will always be a graphics market, whether you like it or not.

Some people have argued that once Fusion and other CPU+GPU hybrids take off, it will spell doom for nVIDIA. Absolutely not! What people fail to realise is that while a CPU/GPU hybrid offers a lot of functionality for a given price, thanks to the manufacturing cost of such chips, one has to think about the performance behind such a product.

Fusion, or any early CPU/GPU product, isn't going to be performing at 2900XT levels, or even 8600GT levels, or heck, 8500GT levels. Why? Such performance requires more complexity, i.e. more transistors. Adding complexity and more transistors requires more power, more R&D, something to interconnect everything, and answers to problems like "how is the bandwidth going to be shared between the GPU and the CPU?" This in turn goes against the whole idea of the CPU+GPU being "cheap". What about the power? The heat? Not to mention that an MCM package would not make much sense at an economic level.

(The 8600GT's G84 core uses 289M transistors. Compare this with the Core 2 Duo's 291 million transistors.)

Fusion and early hybrids (within the next couple of years) are simply targeting the low end (for now), where the concept of CPU+GPU makes logical sense. Notebooks, for one, can benefit from this hugely. We are talking about GPU performance that is going to be little better than, if not similar to, today's integrated GPUs, such as nVIDIA's MCP73/7050 IGP, Intel's G33/G35, or AMD's X1250 (690G). It would literally take years, or generations, before a consumer-level CPU/GPU hybrid could take on a high-end discrete graphics card (and at the rate GPUs are progressing now, I doubt this will happen within the next 5 to 10 years).

Speaking of discrete graphics, nVIDIA is already working on G100, AMD has its R700 project for 2008, and Intel is set to release Larrabee in 2009 and has already confirmed it will "re-enter" the discrete GPU market.

On a side note, nVIDIA buying out VIA would be interesting (a merger would make NO sense at all, since nVIDIA has the spare 2 billion dollars in cash).
 

DrMrLordX

Lifer
Apr 27, 2000
22,932
13,015
136
I think Azn has a point, though we must keep in mind that HT3 and CSI will redefine what it means for a component to be integrated. Both HT3 and CSI should allow you to plug in coprocessors and/or add-in cards in much the same way that processors plug into multi-socket SMP systems currently. It will make more sense for mobo manufacturers to sell systems with an extra processor socket (or, in the case of AM3 boards, extra sockets, HTX slots, or both) so that end-users and/or OEMs can plug in barebones GPUs that utilize the main system's memory subsystem. It will work like an integrated GPU, but it'll be swappable for a new GPU down the line, and mobo costs will remain about the same as they are now.

Some folks might think that would make for some lousy video memory performance, as is often the case with current integrated graphics solutions, but we need to keep in mind that the industry has already released DDR3 SDRAM that can be overclocked to DDR3-2000 speeds or run at stock speeds of DDR3-1666. It is not outside the realm of possibility that we could see DDR3-2500 or higher soon, which should offer at least decent memory performance for a GPU attached via a CSI or HT link.
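A quick back-of-the-envelope check of that claim (a minimal Python sketch; the 64-bit-per-channel bus width and the channel counts are standard assumptions, not anything specified in the post). Dual-channel DDR3-2000 works out to roughly 32 GB/s of peak bandwidth, in the ballpark of a mid-range card's local memory:

```python
# Peak bandwidth for a DDR-style interface:
# GB/s = transfer rate (MT/s) * bus width (bytes) * channels / 1000.
def peak_bandwidth_gbs(mt_per_s: int, bus_bits: int = 64, channels: int = 1) -> float:
    """Theoretical peak bandwidth in GB/s (64-bit channel assumed)."""
    return mt_per_s * 1e6 * (bus_bits // 8) * channels / 1e9

for name, rate in [("DDR3-1666", 1666), ("DDR3-2000", 2000), ("DDR3-2500", 2500)]:
    print(f"{name}: {peak_bandwidth_gbs(rate):5.1f} GB/s single-channel, "
          f"{peak_bandwidth_gbs(rate, channels=2):5.1f} GB/s dual-channel")
```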

Alternatively, we could see octal-core CPUs pushed out to market with hybrid core layouts. Notice how many people have commented that we don't need 4, 8, or more general-purpose cores on anything but high-end number crunchers and servers? Well, that may be true, to a point . . . but why not build 1-4 specialized GPU cores into the die, precluding the need for semi-integrated barebones GPUs? Let's face it, most folks out there are having trouble utilizing the 4 general-purpose cores in their Q6600s as it is . . . if Intel continues to roll out advanced chips at low prices, like a $266 octal-core chip a year from now, what do you think the average user would rather get in that octal-core package: 8 general-purpose cores, or 4 general-purpose cores plus 4 GPU/vector-processing cores? OEMs would love it, researchers might be able to use the GPU cores for number crunching anyway (and might like them better than general-purpose cores), gamers would probably like it . . .

The fact remains that you will see the most powerful GPU solutions on their own cards with memory + memory controller for some time to come, but it should be very easy and very cheap for companies like Intel and AMD to produce bare-bones add-in GPUs or multi-core CPU/GPU hybrids offering graphics performance competitive with low- and mid-range video cards in any given generation of GPUs. Nvidia makes a lot of money selling low-end and mid-range cards to end-users and OEMs, and that will be a lot of lost income if Intel and AMD eat up that market share with integrated or semi-integrated GPUs.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Have you ever thought about cost? Although the whole GPU-socket idea isn't anything new and has been brought up even by nVIDIA, it's not practical or cost-effective. If it were, the industry would have made that move a long time ago.

DDR3-1600 costs more than an 8800GTX, and the prices won't plummet any time soon.

How about bandwidth comparisons? An 8600GTS has 32 GB/s. DDR2-800 can provide up to 6.4 GB/s. Now, since it's a CPU/GPU hybrid, how are you going to share this bandwidth? What sort of trade-offs are you looking at? See my point?

Note - DDR3-1600 offers a peak bandwidth of 12.80 GB/s. The 8800 Ultra has a bandwidth of 102 GB/s. You do the math.
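Doing that math with the figures quoted in this thread (a minimal sketch; the two cards' memory clocks and bus widths are my own assumptions from memory, not numbers given above, and may be slightly off):

```python
# Peak bandwidth = effective transfer rate (MT/s) * bus width (bits) / 8.
def peak_gbs(mts: float, bus_bits: int) -> float:
    return mts * 1e6 * bus_bits / 8 / 1e9

parts = [
    ("DDR2-800, single channel",    800,  64),   # ~6.4 GB/s
    ("DDR3-1600, single channel",  1600,  64),   # ~12.8 GB/s
    ("8600GTS, 128-bit GDDR3",     2000, 128),   # ~32 GB/s
    ("8800 Ultra, 384-bit GDDR3",  2160, 384),   # ~104 GB/s
]
for name, mts, bus in parts:
    print(f"{name:28s} {peak_gbs(mts, bus):6.1f} GB/s")
```

Even generously assuming the hybrid gets the whole memory bus to itself, a single channel of DDR3-1600 has roughly an eighth of the 8800 Ultra's local bandwidth.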
 

AzN

Banned
Nov 26, 2001
4,112
2
0
That's why it's called a hybrid. We do not know how the memory bandwidth is going to be controlled. You are just touting PC architecture. Imagine an Xbox 360-style motherboard with a built-in GPU. You are just looking at what's available to you, Cookie Monster. Wait until the technology is released. We can speculate and argue all we want, but we need to see it for ourselves to determine whether this is a bad or good idea.

 

AnonymouseUser

Diamond Member
May 14, 2003
9,943
107
106
Originally posted by: bryanW1995
Originally posted by: AnonymouseUser
Nvidia needs to buy out or merge with VIA. I think "NVia" could be very competitive with both AMD and Intel, especially in the mobile arena.
Better yet, Nvidia could merge with DAAMIT and create... um, I was never good at these word puzzles... never mind...

I think "DAvid" is what you are looking for... :p
 

PingSpike

Lifer
Feb 25, 2004
21,758
603
126
Originally posted by: aka1nas
I've heard from others previously(Viditor?) that VIA and AMD's X86 licenses are non-transferable and wouldn't be usable if another company bought them out.

That doesn't really make sense... VIA's license came from Cyrix. They purchased Cyrix as a hedge against a lawsuit Intel had against them, and to gain access to the cross-licensing agreements with Intel.

There has been talk about this... of course, the VIA C3 isn't exactly a high-performance chip. It'd be tough to turn it into a competitive player in the high-performance segment after it has been relegated to a low-cost, low-power niche for so many years.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
I don't think that if NVIDIA bought VIA it would be to use their technology, but rather to gain access to the cross-licensing agreements with Intel. I'm sure it would take NVIDIA some time to develop a good CPU... unless they have been working on one already.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: ayabe
Originally posted by: terentenet
I don't think Nvidia will go anywhere. They will remain on the market and they will survive. The market needs discrete graphics and that's where Nvidia comes in. There's no way an integrated GPU by Intel or AMD can match the power of a discrete GPU.
That's how Creative survives, even with the worst drivers....

Yeah, but Creative is struggling massively and probably won't survive more than a couple more years. Discrete sound chips are dead.

Discrete sound chips are dead? Try telling that to the people who work in a recording studio for a living. Onboard = BIG NO-NO. Also, ever hear of EAX? Only Creative chipsets have support for the latest variations.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: LOUISSSSS
Originally posted by: InflatableBuddha
This is a bit of a side issue, but can Nvidia make enough profit on graphics chips for consoles to offset losses in the PC market? Even though it has to compete with AMD/ATI in that market? What about the profits Nvidia makes on graphics for PDAs and mobile phones?

I know high-end PC graphics cards are a niche market, but I guess my main question is, can Nvidia's other products generate enough profit to offset losses in the PC market?

I don't think so. IIRC, isn't the Wii GPU made by ATI? And aren't Wii sales > Xbox & PS3?

Once people realize the Wii has no games...maybe mom will buy a different system instead.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Azn
That's why it's called a hybrid. We do not know how the memory bandwidth is going to be controlled. You are just touting PC architecture. Imagine an Xbox 360-style motherboard with a built-in GPU. You are just looking at what's available to you, Cookie Monster. Wait until the technology is released. We can speculate and argue all we want, but we need to see it for ourselves to determine whether this is a bad or good idea.

I never said anywhere that it was a bad idea. It just doesn't make sense for mid/high-end solutions for years to come.

I'm looking at what's available now and within the next few years in terms of technology AND cost. Cost is the biggest factor with this concept.

You bring up a good point with the console-style motherboards (similar to notebooks). The only problem is that by doing something along those lines, you are severely limiting the flexibility of the motherboard, because everything is being hard-wired to it. What's the difference in cost between soldering RAM onto the motherboard and fitting in cheap DDR2-667? What kind of trade-offs does this create? What if the RAM doesn't work?

Cost. Cost. Cost. The MAIN point of a CPU/GPU hybrid is cost, and this is why it's just too early to talk about such a hybrid being able to take on mid-range offerings from nVIDIA/AMD.

It's just not as easy as saying "let's just put a GPU on the motherboard and this will save a lot of time, effort and money".

 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: nitromullet
Originally posted by: Azn
How much do you think it costs to produce a card with its own memory and PCB? A lot more than a chip built onto the mobo using system RAM as video RAM. Think whatever you want. This is going to happen whether you believe it or not.

AMD's concept is called Fusion for a reason.

I never said it wasn't going to happen... I just said that it wasn't going to be the cheap gaming solution that some people are touting it to be. If it performs well, you'll have your "premium" motherboards that include a mid-range GPU and cost an arm and a leg, and you'll still have your standard crappy integrated graphics chips for people who don't even want/need (aren't willing to pay for) mid-range graphics capabilities. If it doesn't perform well, it will just be a new name for the same old crappy onboard graphics.
nitromullet, I'm pretty sure I've understood your point this entire time. What you're saying is that these new integrated GPUs will not be faster than high-end graphics cards, which means that in theory nVidia will still have some sort of market share. I see two potential problems with that:

1. The high-end market is perhaps 1%. Granted, they may have more leeway than that depending on how powerful Fusion turns out... HOWEVER:

2. AMD has stated that there will be several iterations of Fusion: one for gaming, one for office use, and another that's more balanced. What they are going to do is give more 'GPU cores' to the gaming chip and more 'CPU cores' to the office chip. The thing is, due to the scalar nature of the GPU, they could just add 16 crappy graphics cores to the thing and suddenly it's a beast with an enormous amount of shader power. :light:
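The scaling logic in point 2 can be made concrete with toy numbers (a minimal sketch; the core counts, shader counts, clock, and FLOPs-per-shader figure are all invented placeholders, not AMD specs). Because shader workloads parallelize almost perfectly, peak throughput grows linearly with the number of GPU cores on the die:

```python
# Peak shader throughput ~ cores * shaders/core * clock * FLOPs per shader per cycle.
def peak_gflops(cores: int, shaders_per_core: int, clock_ghz: float,
                flops_per_shader: int = 2) -> float:
    return cores * shaders_per_core * clock_ghz * flops_per_shader

office = peak_gflops(cores=2, shaders_per_core=16, clock_ghz=1.0)   # 64 GFLOPS
gaming = peak_gflops(cores=16, shaders_per_core=16, clock_ghz=1.0)  # 512 GFLOPS
print(f"office die: {office:.0f} GFLOPS, gaming die: {gaming:.0f} GFLOPS")
```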
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: Cookie Monster
Originally posted by: Azn
That's why it's called a hybrid. We do not know how the memory bandwidth is going to be controlled. You are just touting PC architecture. Imagine an Xbox 360-style motherboard with a built-in GPU. You are just looking at what's available to you, Cookie Monster. Wait until the technology is released. We can speculate and argue all we want, but we need to see it for ourselves to determine whether this is a bad or good idea.

I never said anywhere that it was a bad idea. It just doesn't make sense for mid/high-end solutions for years to come.

I'm looking at what's available now and within the next few years in terms of technology AND cost. Cost is the biggest factor with this concept.

You bring up a good point with the console-style motherboards (similar to notebooks). The only problem is that by doing something along those lines, you are severely limiting the flexibility of the motherboard, because everything is being hard-wired to it. What's the difference in cost between soldering RAM onto the motherboard and fitting in cheap DDR2-667? What kind of trade-offs does this create? What if the RAM doesn't work?

Cost. Cost. Cost. The MAIN point of a CPU/GPU hybrid is cost, and this is why it's just too early to talk about such a hybrid being able to take on mid-range offerings from nVIDIA/AMD.

It's just not as easy as saying "let's just put a GPU on the motherboard and this will save a lot of time, effort and money".

I never said you did. You are, however, thinking only in terms of the DDR2 memory bandwidth available to you. Look at the Xbox, for instance. How much do you think Microsoft actually paid to build them when they were first released? Maybe $500, a little less? That's with a DVD drive, hard drive, power supply, case, cooling system, controller, CPU, and memory all in there. Imagine the price of just the motherboard. I've seen motherboards cost in the range of $200-$250, like P35 boards for instance, and now X38, is it? Onboard is a lot cheaper than add-in cards; it just always has been. You are asking way too many "what if" questions just because you have doubt in your mind that somehow it's going to fail miserably. If you don't think it's going to work, you can always blame Intel and AMD later when they "actually FAIL". As of now you have no idea; you just don't like the idea of change in the PC industry.
 

woolfe9999

Diamond Member
Mar 28, 2005
7,153
0
0
Eh, I think in the long run, the fate of Nvidia is kind of a sideshow from the perspective of the consumer. Personally, I am unconcerned with Nvidia's "fears." The fact is, so long as there is a market for high-end GPUs, there will be a supply, regardless of who supplies it.

I agree with an earlier poster who suggested that Nvidia will eventually be acquired. Indeed, I've considered that inevitable since AMD's acquisition of ATI. For Nvidia, my bet is that Intel will be the acquirer. They have the cash flow to do it, and they intend to enter the GPU market strongly in the next few years. Why play catch-up when they don't have to? Nvidia will sell out for their golden parachutes and become a division of Intel, probably retaining its current brand names.

The real issue that is scary for consumers is the fate of AMD. They're getting clubbed right now by Intel, and if Intel acquires Nvidia, it's going to get worse. If AMD isn't around in a couple of years, we'll have no competition in either the GPU *or* CPU markets. Now *that's* something for all of us to be concerned about.

- woolfe
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: SickBeast
Originally posted by: nitromullet
Originally posted by: Azn
How much do you think it costs to produce a card with its own memory and PCB? A lot more than a chip built onto the mobo using system RAM as video RAM. Think whatever you want. This is going to happen whether you believe it or not.

AMD's concept is called Fusion for a reason.

I never said it wasn't going to happen... I just said that it wasn't going to be the cheap gaming solution that some people are touting it to be. If it performs well, you'll have your "premium" motherboards that include a mid-range GPU and cost an arm and a leg, and you'll still have your standard crappy integrated graphics chips for people who don't even want/need (aren't willing to pay for) mid-range graphics capabilities. If it doesn't perform well, it will just be a new name for the same old crappy onboard graphics.
nitromullet, I'm pretty sure I've understood your point this entire time. What you're saying is that these new integrated GPUs will not be faster than high-end graphics cards, which means that in theory nVidia will still have some sort of market share. I see two potential problems with that:

1. The high-end market is perhaps 1%. Granted, they may have more leeway than that depending on how powerful Fusion turns out... HOWEVER:

2. AMD has stated that there will be several iterations of Fusion: one for gaming, one for office use, and another that's more balanced. What they are going to do is give more 'GPU cores' to the gaming chip and more 'CPU cores' to the office chip. The thing is, due to the scalar nature of the GPU, they could just add 16 crappy graphics cores to the thing and suddenly it's a beast with an enormous amount of shader power. :light:

My point is simply that when these on-chip/on-board things come out, if they are competitive performance-wise, they won't be cheap. The people who think this is going to be some cheap entry into the mid-range gaming market are going to be disappointed, either with the performance or the price. It simply doesn't make financial sense for Intel, NVIDIA, or AMD to get rid of the lucrative mid-range market segment and replace it with cheap onboard GPUs.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
I don't think Intel wants Nvidia. I think Nvidia is too big for Intel's own good. If they wanted it, they would have picked it up a while ago. Intel purchased the former Kyro outfit and is working on their own technology.
 

WelshBloke

Lifer
Jan 12, 2005
33,102
11,280
136
OK, people keep saying NV will be fine because there will always be a market for high-end discrete video cards.

That's true, there probably will be a market for those cards.

What, I guess, NV is worried about is their chipset sales and their mid-to-low-end video card business being wiped out.

Without the revenue from these, they're going to have to cut back on a lot of R&D, and that never works when your only market is on the bleeding edge of tech.
 

NoStateofMind

Diamond Member
Oct 14, 2005
9,711
6
76
For nVIDIA to stay competitive, they need to purchase VIA Technologies Inc.

EDIT: This would be good for the consumer, letting nVIDIA stay in the discrete graphics card segment while attacking the CPU market. Three prominent companies offering GPU and CPU solutions would be great if you ask me :D
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: nitromullet
My point is simply that when these on-chip/on-board things come out, if they are competitive performance-wise, they won't be cheap. The people who think this is going to be some cheap entry into the mid-range gaming market are going to be disappointed, either with the performance or the price. It simply doesn't make financial sense for Intel, NVIDIA, or AMD to get rid of the lucrative mid-range market segment and replace it with cheap onboard GPUs.
Well, then I suppose the point the rest of us are making is that with new cores like Fusion, consumers will pretty much save the value of the bill of materials in a graphics card (aside from the GPU itself).

Really, 512MB of GDDR4 and an advanced PCB can't be all that cheap. I realize that the GPU will account for at least 50% of the BOM, but there should still be a savings of at least 30% if I had to take a guess.
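As a rough sanity check of that 30% guess (a minimal sketch; every dollar figure here is an invented placeholder, not real BOM data):

```python
# Toy graphics-card bill of materials, with the GPU at ~50% of the total.
card_bom = {
    "GPU":              60.0,
    "512MB GDDR4":      30.0,
    "PCB and power":    20.0,
    "cooler and misc.": 10.0,
}

total = sum(card_bom.values())
dropped = total - card_bom["GPU"]   # parts a Fusion-style design eliminates
overhead = 25.0                     # assumed extra motherboard/system-RAM cost
print(f"card BOM ${total:.0f}; gross saving {dropped / total:.0%}, "
      f"net saving {(dropped - overhead) / total:.0%}")  # 50% gross, ~29% net
```

With these made-up numbers the gross saving is about half the BOM, and a ~30% net saving is what remains once the shared system RAM and a beefier motherboard claw some of it back.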
 

WelshBloke

Lifer
Jan 12, 2005
33,102
11,280
136
Originally posted by: nitromullet

My point is simply that when these on-chip/on-board things come out, if they are competitive performance-wise, they won't be cheap. The people who think this is going to be some cheap entry into the mid-range gaming market are going to be disappointed, either with the performance or the price. It simply doesn't make financial sense for Intel, NVIDIA, or AMD to get rid of the lucrative mid-range market segment and replace it with cheap onboard GPUs.


It very much makes sense for Intel to get rid of the mid-range market that they earn nothing from, and where Intel goes, AMD has to follow (and unfortunately Nvidia has no way of following).