Fusion vs. Nvidia High End Cards


apathy_next2

Member
Jun 15, 2010
We are having these conversations now in 2010; I bet Nvidia has been having them for a while already. Nvidia is branching off into many different markets, and I don't think they will go away.
I wonder how a GPU-plus-southbridge part from Nvidia would work. I don't know if they are thinking about it, but if they can do something like what Hydra does now, where their southbridge graphics could SLI with the CPU's graphics to boost performance, it could be a decent way to branch into another market. I am not totally sure that is feasible, but if Hydra can do it, I bet Nvidia can as well. Just think about it: you have a CPU with 5570-level graphics and you SLI that with a GT210/220 GPU. You may not have 5770 performance, but it should be damn close. And that performance only gets higher as they integrate higher-level GPUs on the chip.
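Rough numbers, just to illustrate the idea (a sketch; the performance index values and the 75% scaling efficiency below are guesses made up for illustration, not benchmarks):

Code:
# Hypothetical Hydra-style pairing of integrated + discrete graphics.
# All index values are illustrative guesses, not measured numbers.
igp = 100          # CPU-integrated GPU, roughly HD 5570 class
discrete = 55      # low-end discrete card, roughly GT 220 class
efficiency = 0.75  # assumed loss from Hydra-style load balancing

combined = (igp + discrete) * efficiency
hd5770 = 150       # a discrete HD 5770 on the same made-up scale

print(f"Combined: {combined:.0f} vs. {hd5770} for an HD 5770")
# Combined: 116 vs. 150 for an HD 5770 -- not quite there, but close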

Diversity: that is Nvidia's future, and I think they are on the right track with that.
 

SlowSpyder

Lifer
Jan 12, 2005
How are they going to overcome the bandwidth limitations? On-die local memory?

I wonder if they'd be able to create some kind of advanced SidePort memory?

I don't see Fusion challenging Nvidia anywhere other than the low end. But as someone else said, that could suck a lot of revenue dry for Nvidia.
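For perspective, the gap is easy to put numbers on. A minimal sketch, assuming typical 2010-era specs (dual-channel DDR3-1600 for system memory; the HD 5770's 128-bit GDDR5 at 4.8 GT/s effective):

Code:
def peak_bandwidth_gb_s(mega_transfers_per_s, bus_width_bits):
    """Peak theoretical memory bandwidth in GB/s."""
    return mega_transfers_per_s * 1e6 * (bus_width_bits / 8) / 1e9

# Dual-channel DDR3-1600: 1600 MT/s over a 128-bit combined bus,
# shared between the CPU cores and any on-die GPU.
system_ram = peak_bandwidth_gb_s(1600, 128)   # 25.6 GB/s

# Radeon HD 5770: GDDR5 at 4800 MT/s effective on a 128-bit bus,
# dedicated entirely to the GPU.
hd5770 = peak_bandwidth_gb_s(4800, 128)       # 76.8 GB/s

print(f"Shared DDR3: {system_ram:.1f} GB/s, HD 5770: {hd5770:.1f} GB/s "
      f"({hd5770 / system_ram:.0f}x the bandwidth)")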
 

Wreckage

Banned
Jul 1, 2005
Intel's Sandy Bridge does exist, and it does confirm it's powerful enough for the average Joe computer user.

Sure, the same "average Joe" computer user who is already using Intel GMA. I have not seen any benchmarks where it beats a $40 GT240. So for now, NVIDIA's low-end discrete market is safe and sound.
 

ModestGamer

Banned
Jun 30, 2010
We are having these conversations now in 2010; I bet Nvidia has been having them for a while already. Nvidia is branching off into many different markets, and I don't think they will go away.
I wonder how a GPU-plus-southbridge part from Nvidia would work. I don't know if they are thinking about it, but if they can do something like what Hydra does now, where their southbridge graphics could SLI with the CPU's graphics to boost performance, it could be a decent way to branch into another market. I am not totally sure that is feasible, but if Hydra can do it, I bet Nvidia can as well. Just think about it: you have a CPU with 5570-level graphics and you SLI that with a GT210/220 GPU. You may not have 5770 performance, but it should be damn close. And that performance only gets higher as they integrate higher-level GPUs on the chip.

Diversity: that is Nvidia's future, and I think they are on the right track with that.

Don't be fooled. Nvidia is crapping cinder blocks trying to find a new business model. Niche gaming might be it, along with other applications like video rendering.

They don't really have a choice.
 

apathy_next2

Member
Jun 15, 2010
People who buy GMA 4500 chipsets or AMD 785G boards these days will be the ones buying Fusion and Sandy Bridge. Nvidia cards have almost always been an add-on upgrade, have they not?

What I am trying to say is that Nvidia, which never really had that big a share of the integrated graphics market, won't be hurt that much.

Sure, future on-chip graphics will be more powerful and capable of running more games, but it's all relative: there will be games out by then that run like shit on those chips, just like there are games out now that run like shit on integrated graphics. But you can always play older games, and that will be true in the future as well.
 

Dark Shroud

Golden Member
Mar 26, 2010
Sure, the same "average Joe" computer user who is already using Intel GMA. I have not seen any benchmarks where it beats a $40 GT240. So for now, NVIDIA's low-end discrete market is safe and sound.

If OEMs can get by buying these hybrids from AMD & Intel in bulk, they won't need to buy cards from Nvidia for their low- to mid-range off-the-shelf preconfigured systems. This is going to hurt Nvidia's income at some point. That's not even taking into account future revisions from AMD & Intel, which will only get better. Just look at the performance jumps Intel has made.

FYI, the new models are not called Intel GMA anymore.
 

solofly

Banned
May 25, 2003
First of all, AMD needs to learn how to write drivers and forget about these flashy names they come up with... lol
 

ModestGamer

Banned
Jun 30, 2010
First of all, AMD needs to learn how to write drivers and forget about these flashy names they come up with... lol


Actually, game engines need to stop molesting the API for useless crap, and then we wouldn't need constant driver mutations.

DX11 brings 99% of those outboard features, and AMD supports DX11 just fine.
 

ModestGamer

Banned
Jun 30, 2010
If OEMs can get by buying these hybrids from AMD & Intel in bulk, they won't need to buy cards from Nvidia for their low- to mid-range off-the-shelf preconfigured systems. This is going to hurt Nvidia's income at some point. That's not even taking into account future revisions from AMD & Intel, which will only get better. Just look at the performance jumps Intel has made.

FYI, the new models are not called Intel GMA anymore.


Trust me, Nvidia is patently worried about this. They sell a lot of integrated and laptop graphics products too. They are getting locked out here, and I would wager that's a good 30-45% of their gross revenue. They really have two options: sell the IP and move to other markets, or start making CPUs and competing.
 

akugami

Diamond Member
Feb 14, 2005
I don't think Fusion is going to be that powerful. Just think for a minute about the size of heatsink you would need to cool a couple of 5970s and a CPU all in the same package. Even if they manage to reduce the power 20-30% and retain the same performance, that's still going to be over 400 watts to cool; good luck pulling that off on air.

Well, Fusion is/was never meant to replace the top-of-the-line cards. It'll be the low end that is impacted. Go look at Anand's Sandy Bridge preview to get an idea of what will be affected. The low-end discrete video card market looks like it's dead in two years' time. I don't think anyone with a Sandy Bridge is going to be pairing it with a $100-ish video card at all. Fusion should provide similar benefits.

So the mid- to high-end discrete video cards will always be there. It's just that the low end is effectively dead. Don't forget that the majority of laptops use a low-end graphics chipset. There will be almost no reason to buy nVidia or AMD discrete graphics chips for laptops.

Sounds like wishful thinking. Honestly. 5770 performance a year from now? That will be a $40 card by then. High-end discrete graphics won't be the territory of Fusion. If it ever comes to that, consoles will have won the war.

I think the problem with this thinking is that there is a point of diminishing returns. Radeon 5770-level performance will satisfy the majority of computer users, who are at most casual gamers. Keep in mind that most AnandTech readers are more tech-oriented. We are in the minority as far as the overall computer market goes, even if we make up the majority of discrete video card buyers.

Could be even higher given the 6770 results. I doubt the first-generation Fusion chips will get the 6770 stuff, but there's no reason the Bulldozer APU won't, considering it's due out in late 2011.

It likely won't be based on the 6xxx series. I'd assume you'd want a stable GPU to integrate into your CPU design. Hence, Fusion products will likely be one generation behind. Of course, these are merely my ASSumptions, since I know nothing of chip design.

Trust me, Nvidia is patently worried about this. They sell a lot of integrated and laptop graphics products too. They are getting locked out here, and I would wager that's a good 30-45% of their gross revenue. They really have two options: sell the IP and move to other markets, or start making CPUs and competing.

Absolutely. Why do you think nVidia is moving to Tegra? The ARM-based market is plenty crowded, and nVidia actually faces fierce competition from a lot of GPU vendors in the embedded space. I do think that nVidia's experience with GPUs will help Tegra, but it's going to take a while.
 

Seero

Golden Member
Nov 4, 2009
One day, the CPU will be inside my watch and will take its electricity off my hand, generated from my excess fat. The display will either be on my glasses or on the surface of my eye. I/O will be jacked directly into my backbone. To overclock, I'll simply cut off my arms and legs for more electricity; it isn't like I'll need them by then.

Fusion will bring us there.

Until then, Nvidia will create processors that fit perfectly inside a human skull, taking I/O directly from neurons and energy from blood. Best of all, they come with flesh too.

The problem is, we will no longer have a choice in what we eat or drink, because Intel will be the only vendor for food and Mac will sell the water.
 

ModestGamer

Banned
Jun 30, 2010
One day, the CPU will be inside my watch and will take its electricity off my hand, generated from my excess fat. The display will either be on my glasses or on the surface of my eye. I/O will be jacked directly into my backbone. To overclock, I'll simply cut off my arms and legs for more electricity; it isn't like I'll need them by then.

Fusion will bring us there.


Overclocking requires oral stimulation?
 

Seero

Golden Member
Nov 4, 2009
Overclocking requires oral stimulation?
Yes, or injection.

All programs will become subprograms of a bigger program called "Reality," and we will no longer use Hz to measure the speed of the processor; we will use something called time.

Like all other inventions, there is an expiry date on each individual unit. So each unit must make good use of "Reality" before its expiry date.

Oh yeah, I forgot: I don't know where exactly the expiry date is printed.
 

Arkaign

Lifer
Oct 27, 2006
Trust me, Nvidia is patently worried about this. They sell a lot of integrated and laptop graphics products too. They are getting locked out here, and I would wager that's a good 30-45% of their gross revenue. They really have two options: sell the IP and move to other markets, or start making CPUs and competing.

Unless PC architecture changes a lot, even DDR3-2000 with fairly low latency is pathetically slow compared to the RAM on even a 5770, and the 5770 doesn't have to share that RAM with anything else. Also think of the heat put out by the 5770 core. Even with a die shrink, I think it's EXTREMELY ambitious to believe that Fusion will be competitive with a full-blown 5770. To add to that, in a year's time the 5770 will be almost irrelevant for anything more than very casual gaming; new titles will continue to push the 5770 out of the picture as even a passable gaming card. That brings Fusion back to the same kind of casual use that stuff like the onboard Intel HD (previously GMA), Nvidia 7150, ATI onboard HD4200, etc. sees today.

What looks to be going away is Nvidia selling chipsets. That's already pretty much a done deal.

As for low- to mid-range desktops, how often do you even see them offer a GT220/5450/etc. as standard equipment? Personally, I almost NEVER see those types of cards offered. Why?

(1)- They're not any better than modern onboard graphics for stuff like MS Office, Blu-ray, Facebook, etc.

(2)- They're not acceptable even for very casual gaming, unless you're playing ancient games. Even WoW (bleh) will choke on these cheap cards at anything but terrible detail settings.

What is going to continue to challenge Nvidia is ATI's discrete products. Nvidia will lose the rest of their onboard chipset crap, but that's a done deal anyway. But neither SB nor Fusion will make any more than a tiny dent in discrete sales; they'll just replace chipset-based video with CPU-resident video. Gamers, even pretty casual ones, who might toss 'Rage' or 'Crysis 2' into their bin at Best Buy, will not see playable results from these cheap integrated solutions any more than the HD4200 or a Core i3 GPU gives. 19 fps instead of 7 fps is not a win here. Gamers are a moving target, and we've heard many, many years of promises about how onboard video was finally going to be playable for games. Has it ever been true? Yes, if you freeze time. You can use Intel HD to play games from 2004 pretty well, can't you? Can the ATI HD4200 run Crysis? Can it even run something more pedestrian, say Fallout 3 on an old engine? Hah.

This conjecture about some onboard stuff from Intel or ATI making more than a superficial change is just mental masturbation. If Nvidia dies, it will be because they were beaten at the discrete game, and they have been for most of recent history already. But look back at the past decade or so and think of the most efficient bang-for-the-buck GPUs of each ~2-year era (think ATI 9500 Pro, Nvidia Ti4200, 8800GT, X1800GTO, etc.), what kind of memory they used (clock speed/bit width), how much heat they dissipated, and their die sizes. Now imagine any system from that same era having that GPU slammed into the CPU die, with all of its memory having to come from main system RAM. Feasible? I think not.

I think Fusion will eventually come out, it will be decent, and it will make for a more elegant solution than chipset-resident video for very casual use. But it'll never overlap with discrete sales, aside from the random poor saps who might plug a GF7300GS into a system that already had onboard ATI HD4200 :p
 

RussianSensation

Elite Member
Sep 5, 2003
What is going to continue to challenge Nvidia is ATI's discrete products. Nvidia will lose the rest of their onboard chipset crap, but that's a done deal anyway. But neither SB nor Fusion will make any more than a tiny dent in discrete sales; they'll just replace chipset-based video with CPU-resident video.

That's my opinion as well. The SB 12-EU GPU is only about as fast as a G310M/5450. The GT415M (now the slowest discrete mobile GPU from NV) will be almost 2x faster than SB graphics. And SB hasn't even launched yet. I think AMD needs to design a better CPU in the first place before anyone will even consider a laptop with an AMD processor in it :) By the time Fusion ships next year, games will be even more advanced. Consumers who buy discrete graphics in the first place aren't going to seriously consider SB as a "gaming alternative." I am sure Fusion will be more impressive by 2012-2013, but Intel gaming graphics is an oxymoron.
 

Modelworks

Lifer
Feb 22, 2007
Nvidia already has their market: professional users. ATI barely comes close to Nvidia when it comes to workstation users. If they stay with that and bring chips to mobile markets, they can do very well. Don't forget that before 3D PC gaming became popular, there were companies like 3Dlabs making good profits off pro card sales. I remember the Wildcat cards, which made a Quadro's price look like loose change. AMD and Fusion are what MS wants for Windows 8. The slides being shown in Windows 8 design meetings clearly show that what MS wants is an iMac clone for Windows, where the PC, screen, and everything the user needs is all inside one unit, unified behind a touch-screen interface. AMD Fusion fits that market perfectly.


[Image: Windows-8-Machine.png]
 

Will Robinson

Golden Member
Dec 19, 2009
The problem is, without the heavy revenue the low-end retail cards bring in, NVDA may struggle to fund the very expensive research and development needed to compete with the high-end AMD cards.
Class-leading chips don't come easily or quickly, and they are notorious sinks for capital.
If the new AMD/Intel CPU+GPU chips take that income stream away, NVDA will have to fund its research from its new market customers.
They also have to consider a better-funded AMD, with GloFo up and running, entering these new markets as well.
 

dzoner

Banned
Feb 21, 2010
I would never buy a CPU/APU/MOBO/whatever that by any means pigeon-holes me into buying their discrete cards to keep from having a crippled system.

"non-optimized' ... o_O

lol. It's still an unknown; I was speculating.

I know the Bobcat CPU will have extensive power-optimization features, including pretty fine-grained power gating. No mention was made of what power optimizations are being applied to Bobcat's GPU core, probably withheld as a little surprise for Nvidia alongside the 6xxx cards, but the trend is crystal clear. Second-generation Fusion desktop chips can be expected to have extremely fine power gating and highly optimized power efficiency in general. A quad core plus an 800-shader (equivalent) GPU core at 3.0 GHz+ should run quite cool 95% of the time, even at 32nm. At 22nm, six cores plus 1600 shaders. That will be available for Christmas 2014 shopping.

Intel isn't going to be that far behind. When the behemoths get SERIOUS about something, things start happening, and they have accumulated several years of hard-won knowledge with Larrabee. And they are CONTINUING to learn and grow their graphics division.

A second-generation desktop Fusion chip will use just as much power as the application(s) need to run, including games, and the same goes for then-current-generation AMD AIBs. Add in an SSD, an 80 Plus Gold power supply, and steadily rising electricity costs (this is not happening in a vacuum), and that Fusion system will provide substantial cost savings month after month.

It's all about surfing the web on 10 W of total system power, then firing up a game and having the system use precisely the amount of power needed to play that game, moment by moment.

So instead of the gaming system you have now sucking down 120 or 180 watts as you surf the web, post to forums, go get something to eat, or, like mine, run 24 hours a day because it's your DVR, you'll have a system that idles at 10 watts and is capable of running the latest computer games at high settings.
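Back-of-the-envelope on what that gap is worth (a sketch; the 120 W idle draw and the $0.12/kWh electricity rate are assumptions for illustration):

Code:
hours_per_month = 24 * 30  # always-on system, e.g. one doubling as a DVR

idle_now_w = 120         # assumed idle draw of a current gaming box
idle_fusion_w = 10       # the hoped-for Fusion idle figure
rate_usd_per_kwh = 0.12  # assumed electricity rate

saved_kwh = (idle_now_w - idle_fusion_w) * hours_per_month / 1000
print(f"~{saved_kwh:.0f} kWh/month saved, about "
      f"${saved_kwh * rate_usd_per_kwh:.2f}/month")
# ~79 kWh/month saved, about $9.50/month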
 

dzoner

Banned
Feb 21, 2010
To add to that, in a year's time the 5770 will be almost irrelevant for anything more than very casual gaming; new titles will continue to push the 5770 out of the picture as even a passable gaming card. That brings Fusion back to the same kind of casual use that stuff like the onboard Intel HD (previously GMA), Nvidia 7150, ATI onboard HD4200, etc. sees today. :p

Interesting worldview.

Name a console-derived game I can't handily play at moderate to high settings on my 1080p TV.

Next year's console-derived games will require far more graphics horsepower because... ?

PS: Check out the latest Steam survey to see how the 57xx cards fare when actual COMPUTER gamers are putting down their money to buy a new graphics card.
 

T2k

Golden Member
Feb 24, 2004
They will be fine provided they find high-end buyers for HPC cards.

Otherwise there will be no proverbial cash cow to warrant the high R&D costs... but that is just my opinion. It may be perfectly possible for a company to survive on (roughly) high-end cards only. They do have Tegra as well.

Yeah, anyone who thinks Quadros can finance the required R&D costs is smoking something highly illegal. I don't, but I also don't think they will be able to create a new, highly profitable niche CUDA/GPGPU market to sustain their product development cycles.

Ahh, and the Tegra series is an utter and complete disaster so far.
 

Genx87

Lifer
Apr 8, 2002
I don't expect Fusion to kill anything but the sub-$100 market. But thinking about it more, I question whether it will actually even do that. And Tegra should start picking up steam by making it into tablets and HTC and Motorola phones.
 

Dark Shroud

Golden Member
Mar 26, 2010
And Tegra should start picking up steam by making it into tablets and HTC and Motorola phones.

The big problem for Nvidia with Tegra is that no one is using it. Marvell just released a triple-core ARM CPU, and Snapdragon is probably the fastest. Tegra has to compete with these products.

Add to that Intel shrinking SB-based Atoms down. So while ARM CPUs are trying to move up, Intel is trying to move down with Atom.