Nvidia feels small compared to Intel/AMD


taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: woolfe9999
Eh, I think in the long run, the fate of Nvidia is kind of a sideshow from the perspective of the consumer. Personally, I am unconcerned with Nvidia's "fears." The fact is, so long as there is a market for high-end GPUs, there will be a supply, regardless of who supplies it.

I agree with an earlier poster who suggested that Nvidia will eventually be acquired. Indeed, I've considered that inevitable since AMD's acquisition of ATI. For Nvidia, my bet is that Intel will be the acquirer. They have the cash flow to do it, and they intend to enter the GPU market strongly in the next few years. Why play catch-up when they don't have to? Nvidia will sell out for their golden parachutes and will become a division of Intel, probably retaining its current brand names.

The real issue that is scary for consumers is the fate of AMD. They're getting clubbed right now by Intel, and if Intel acquires Nvidia, it's going to get worse. If AMD isn't around in a couple of years, we'll have no competition in either the GPU *or* CPU markets. Now *that's* something for all of us to be concerned about.

- woolfe

then you will see a sharp increase in price and a decrease in performance... followed by rebels running "illegal goods" from China and Russia... those being processors that break copyright laws... and probably DRM and big-brother laws too...

Or none of this happens and we continue with the same companies for many years to come.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: SickBeast
Originally posted by: nitromullet
My point is simply that when these on-chip/board things come out, if they are competitive performance-wise, they won't be cheap. The people who think that this is going to be some cheap entry into the mid-range gaming market are going to be disappointed - either with performance or price. It simply doesn't make financial sense for Intel, NVIDIA, or AMD to get rid of the lucrative mid-range market segment and replace it with cheap onboard GPUs.
Well then I suppose the point that the rest of us are making is that consumers will pretty much save the value of the bill of materials involved with graphics cards (aside from the GPU) with the new cores like Fusion.

Really, 512MB of DDR4 and an advanced PCB can't be all that cheap. I realize that the GPU will account for at least 50% of the BOM, but there should still be a savings of at least 30% if I had to take a guess.

That's what I've been saying all along but he doesn't want to believe it.
 

DrMrLordX

Lifer
Apr 27, 2000
22,931
13,014
136
Originally posted by: Cookie Monster
Have you ever thought about cost? Although the whole GPU socket idea isn't anything new and has been brought up even by nVIDIA, it's not practical or cost-effective. If it were, the industry would have made that move a long time ago.

It's not cost-effective when the GPU is grafted onto the board. It could be when the GPU is sold without memory and without a memory controller (or with a simpler one than we see on cards today) and pops into a CPU socket.


DDR3-1600 costs more than an 8800GTX and the prices won't plummet any time soon.

Why do you say that? I'll bet prices on DDR3 will drop faster than prices on video cards. DDR3 will be produced en masse and sold at the commodity level, and DDR3 has moved to high speeds VERY quickly. We're also neglecting to mention here that when you buy faster RAM, it potentially benefits the entire system, not just your graphics subsystem.

How about bandwidth comparisons? An 8600GTS has 32 GB/s. DDR2-800 can provide up to 6.4 GB/s. Now, since it's a CPU/GPU hybrid, how are you going to share this bandwidth? What sort of trade-offs are you looking at? See my point?

note - DDR3-1600 offers a peak bandwidth of 12.80 GB/s. The 8800 Ultra has a bandwidth of 102 GB/s. You do the math.

First off, you're underestimating the maximum bandwidth of DDR2-800 in a dual-channel configuration. Dual-channel DDR2-800 provides 12.8 GB/s. Dual-channel DDR3-1600 provides 25.6 GB/s. Dual-channel DDR3-2000 provides 32 GB/s. Of course, all those numbers are theoretical, but we are talking about systems that will not be limited by an FSB. It should be possible to hit theoretical bandwidth limits provided the memory controllers are up to the task.

Furthermore, while I can't predict what we'll see on CSI boards, HT boards already offer multiple banks of memory based on how many sockets there are on the board. A two-socket board with one CPU and one HT-capable GPU (assuming it has its own on-die memory controller) would double your effective memory bandwidth again (I think you'd need a NUMA-aware OS for this?). Imagine a 4GB system with two 1GB sticks of DDR3-1600 per socket giving you 51.2 GB/s of memory bandwidth. That's quite a bit more than you get from an 8600GTS and half as much as is available to the GPU on an 8800 Ultra. Hell, you can get 34 GB/s from a dual-socket board with four sticks of DDR2-1066, which would be enough for a mid-range GPU, and DDR2-1066 is already getting cheap.
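For anyone who wants to check those figures, here's a minimal back-of-envelope sketch (my own illustration, not anything from the posts above). It assumes a standard 64-bit DDR channel and ignores controller overhead, so real-world throughput will land below these theoretical peaks:

```python
# Theoretical peak DDR bandwidth: transfers/s * 8 bytes per 64-bit
# channel, times the number of channels (and sockets on a NUMA board).
def peak_gb_s(mt_per_s, channels=2, sockets=1):
    bytes_per_transfer = 8  # one 64-bit channel moves 8 bytes per transfer
    return mt_per_s * bytes_per_transfer * channels * sockets / 1000.0

print(peak_gb_s(800))               # DDR2-800, dual channel    -> 12.8 GB/s
print(peak_gb_s(1600))              # DDR3-1600, dual channel   -> 25.6 GB/s
print(peak_gb_s(2000))              # DDR3-2000, dual channel   -> 32.0 GB/s
print(peak_gb_s(1600, sockets=2))   # two sockets of DDR3-1600  -> 51.2 GB/s
print(peak_gb_s(1066, sockets=2))   # two sockets of DDR2-1066  -> ~34 GB/s
```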

It would not be difficult to feed a CSI/HT-capable GPU with enough system memory bandwidth to keep it happy. I predict they could pull the GPU off something like the 2600XT or 8600GTS, put a simple on-die memory controller on there similar to what you get with an X2 or Phenom, and sell the chip for maybe $50-$100.

Sure, you also have the cost of adding 256-512MB of memory to the system if the system's current memory level is not enough to feed both the system and the GPU simultaneously, and the cost of adding a second socket to the board, but those costs are one-time for the life of the motherboard.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Azn
I never said you did. You, however, only think in terms of the DDR2 memory bandwidth available to you now. Look at the Xbox, for instance. How much do you think Microsoft actually paid to build one when they were first released? Maybe $500, a little less? That's with a DVD drive, hard drive, power supply, case, cooling system, controller, CPU, and memory all in there. Imagine the price of just the motherboard. I've seen motherboards cost in the range of $200-$250, like P35 boards for instance. Now it's X38? Onboard is a lot cheaper than add-in cards. It just always has been. You are asking way too many "what if" questions just because you have doubt in your mind that somehow it's going to fail miserably. If you don't think it's going to work, you can always blame Intel and AMD later when they "actually FAIL". As of now you have no idea; you just don't like the idea of change in the PC industry.

You don't have to put words in my mouth. I never said anything about CPU/GPU hybrids failing, nor did I say anything about the whole "change" in the PC industry.

A P35 that costs $200-$250? I've seen decent P35 mobos from Abit's IP-35 series cost around $100, give or take 10 dollars. How about the Gigabyte P35-DS3R? I'm not sure where you are looking, but the market segment you're looking at is the high end.

Onboard solutions are indeed cheaper. Why? Because the quality of the components often lags behind discrete solutions; cheaper (and also a lot simpler) components are used. Current-day IGPs (7050/G33) house 1-2 pixel shaders and 1 TMU, and they are quite the improvement over last year's.

Being innovative is one thing, but changing something that doesn't require a fix is not what I call innovative. The PC industry is always innovative where it makes sense, thanks to some of the brilliant engineers and minds here today. (With a few minor exceptions. :p)

Like I said before, CPU/GPU hybrids are for the low end, where they do make sense, and will eventually take over the IGP market (depending on how the market reacts), and there are hardly any "ifs" like I was supposedly ranting about.

To me, this is going nowhere due to a variety of reasons that I'm not going to state.

Sorry for going OT though. :p

note - understand that complexity always increases price due to various factors.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Basically, what this is setting out to do is extend the integrated video market from "ultra low end" all the way up to the "medium-low" end, or possibly eventually even to "medium-high" (although that is much less likely).
The current mid-low end solutions will be replaced with cheaper and better mid-low integrated solutions... and most non-casual gamers will still buy a video card.

The only real threat is if AMD and Intel decide to make their video solutions NOT compatible with each other and with other offerings... thus locking people into buying a chipset+CPU+GPU bundle from one of them with no option for third parties or mixing and matching... this, however, is illegal, and requires the two corporations to cooperate (because either one could back-stab the other by secretly making their device compatible with third-party hardware like NVIDIA GPUs and then annihilate the other, as everyone would buy the compatible product).

It is still years off, but we will likely get there... we will most likely end up with something like the Cell processor... with a bunch of non-identical cores...
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: WelshBloke
Originally posted by: nitromullet

My point is simply that when these on-chip/board things come out, if they are competitive performance-wise, they won't be cheap. The people who think that this is going to be some cheap entry into the mid-range gaming market are going to be disappointed - either with performance or price. It simply doesn't make financial sense for Intel, NVIDIA, or AMD to get rid of the lucrative mid-range market segment and replace it with cheap onboard GPUs.


It very much makes sense for Intel to get rid of the mid-range market that they earn nothing from, and where Intel goes, AMD has to follow (and unfortunately NVIDIA has no way of following).

It also makes sense for them to enter the mid-range market with a cheap-to-produce product that they can use to undercut the discrete cards. So they sell you an onboard chip for $80 that performs as well as a $100 8600GT, but it needs DDR3 RAM to do it...
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: Cookie Monster
Firstly, Intel's lead in the total graphics market is being contested by nVIDIA. Want to see the numbers?

Nvidia is closing the gap for the number one position in graphics, AMD loses ground and Intel is flat

Growth at 82.9% (Intel at 3.1%); nVIDIA market share at 32.6% (Intel at 19.5%). This is based on Q2 '07.

Now tell me, how is nVIDIA going to disappear within the next few years? To me that's just false. There will always be a graphics market whether you like it or not.

Some people have argued that once Fusion and other CPU+GPU hybrids take off, it will spell doom for nVIDIA. Absolutely not! What people fail to realise is that while a CPU/GPU hybrid offers a lot of functionality for a given price due to the manufacturing cost of such chips, one has to think about the performance behind such a product.

Fusion or any early CPU/GPU product isn't going to be performing at 2900XT levels, or even 8600GT levels, or heck, 8500GT levels. Why? Such performance requires more complexity, i.e. more transistors. Adding complexity and more transistors requires more power, more R&D, something to interconnect everything, and solving problems such as "how is the bandwidth going to be shared between the GPU and CPU?" This in turn goes against the whole CPU+GPU being "cheap". What about the power? The heat? Not to mention an MCM package would not make much sense at an economical level.

(8600GT - the G84 core uses 289M transistors. Compare this with the Core 2 Duo's 291 million transistors.)

Fusion and early hybrids (within the next couple of years) are simply targeting the low end (for now), where the concept of CPU+GPU makes logical sense. Notebooks, for one, can benefit from this hugely. We are talking about GPU performance that is going to be little better than, if not similar to, today's integrated GPUs such as nVIDIA's MCP73 or 7050 IGP, Intel's G33/35, or AMD's X1250 (690G). It would literally take years or generations before a consumer-level GPU/CPU hybrid could take on a high-end discrete graphics card (and at the rate GPUs are going now, I doubt this will happen within the next 5 to 10 years).

Talking about discrete graphics, nVIDIA is already working on G100. AMD has its R700 project for 2008. Intel is set to release Larrabee in 2009 and has already confirmed it will "re-enter" the discrete GPU market.

On a side note, nVIDIA buying out VIA would be interesting. (Merging would make NO sense at all since nVIDIA has the spare 2 billion dollars in cash.)
Yeah, the last time a GPU company and a CPU company merged, the results were fantastic...

 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: cmdrdredd
Originally posted by: LOUISSSSS
Originally posted by: InflatableBuddha
This is a bit of a side issue, but can Nvidia make enough profit on graphics chips for consoles to offset losses in the PC market? Even though it has to compete with AMD/Ati in that market? What about profits Nvidia makes on graphics for PDAs and mobile phones?

I know high end PC graphics cards are a niche market, but I guess my main question is, can Nvidia's other products generate enough profit to offset losses in the PC market?

Don't think so. IIRC, isn't the Wii GPU made by ATI? And aren't Wii sales > Xbox & PS3?

Once people realize the Wii has no games...maybe mom will buy a different system instead.
Are you serious? The Wii is kicking the crap out of the Xbox and PS3 in sales. Every gaming company worth a damn is making games as fast as they can for the Wii (unless prohibited by exclusivity deals on specific titles). That's why Sony and M$ are willing to sell their consoles at a big loss; they need to have them out there to get more games made for them. The Wii is the hot item right now and has driven Nintendo up to #2 in total Japanese market cap. That didn't happen because a bunch of "moms" are going to be buying different systems for their kids.

 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: Azn
I don't think Intel wants Nvidia. I think Nvidia is too big for Intel's own good. If they did, they would have picked it up a while ago. Intel purchased the former Kyro and is working on their own technology.
QFT

 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Yep... now they would want to make it COST a fraction of the price... that way they could make much more profit... manufacturing for $10 instead of $70 and selling for $80 instead of $100 means much more money in their hands... selling too low is bad business.
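As a trivial worked example of that margin argument (purely hypothetical numbers lifted from the post above, not real cost data):

```python
# Hypothetical per-unit margins: a cheap-to-make integrated part sold
# a bit below the discrete card's price still nets far more per unit.
discrete_margin = 100 - 70    # sell for $100, build for $70 -> $30/unit
integrated_margin = 80 - 10   # sell for $80, build for $10 -> $70/unit
print(discrete_margin, integrated_margin)  # 30 70
```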
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: Cookie Monster
Originally posted by: Azn
I never said you did. You, however, only think in terms of the DDR2 memory bandwidth available to you now. Look at the Xbox, for instance. How much do you think Microsoft actually paid to build one when they were first released? Maybe $500, a little less? That's with a DVD drive, hard drive, power supply, case, cooling system, controller, CPU, and memory all in there. Imagine the price of just the motherboard. I've seen motherboards cost in the range of $200-$250, like P35 boards for instance. Now it's X38? Onboard is a lot cheaper than add-in cards. It just always has been. You are asking way too many "what if" questions just because you have doubt in your mind that somehow it's going to fail miserably. If you don't think it's going to work, you can always blame Intel and AMD later when they "actually FAIL". As of now you have no idea; you just don't like the idea of change in the PC industry.

You don't have to put words in my mouth. I never said anything about CPU/GPU hybrids failing, nor did I say anything about the whole "change" in the PC industry.

A P35 that costs $200-$250? I've seen decent P35 mobos from Abit's IP-35 series cost around $100, give or take 10 dollars. How about the Gigabyte P35-DS3R? I'm not sure where you are looking, but the market segment you're looking at is the high end.

Onboard solutions are indeed cheaper. Why? Because the quality of the components often lags behind discrete solutions; cheaper (and also a lot simpler) components are used. Current-day IGPs (7050/G33) house 1-2 pixel shaders and 1 TMU, and they are quite the improvement over last year's.

Being innovative is one thing, but changing something that doesn't require a fix is not what I call innovative. The PC industry is always innovative where it makes sense, thanks to some of the brilliant engineers and minds here today. (With a few minor exceptions. :p)

Like I said before, CPU/GPU hybrids are for the low end, where they do make sense, and will eventually take over the IGP market (depending on how the market reacts), and there are hardly any "ifs" like I was supposedly ranting about.

To me, this is going nowhere due to a variety of reasons that I'm not going to state.

Sorry for going OT though. :p

note - understand that complexity always increases price due to various factors.

Of course you never said word for word that it's going to fail, but you are doubting these hybrids before they are even here and before you know anything about the technology. You are comparing current onboard solutions to your $100 video card and think ALL onboard video sucks, even future hybrids. When P35 first came out, boards cost $200-$250. Even some P35 boards still cost that much today. Now X38 is the new technology costing that much.

You typed the words right out of my mouth - what I said earlier about you not liking the idea of change in the PC industry. So what I said before was on point, because you somehow think it's not innovative.

You don't think it needs a fix? You don't think graphics cards are too expensive? $500-$600+ for a graphics card? No way in hell am I going to pay $500 for a graphics card. If producing a hybrid saves these corps money, it will cost us less. Two products cost MORE than one incorporated product. You can still upgrade your graphics subsystem if you'd like and stick to your $500 video cards, but I'd rather pay less and get more for my money.