Trust me, Nvidia is patently worried about this. They sell a lot of integrated and laptop graphics products too. They're getting locked out here, and I'd wager that's a good 30-45% of their gross revenue. They really have two options: sell the IP and move to other markets, or start making CPUs and competing.
Unless PC architecture changes a lot, even DDR3-2000 with fairly low latency is pathetically slow compared to the RAM on even a 5770, and the 5770 doesn't have to share that RAM with anything else. Also think of the heat put out by the 5770 core. Even with a die shrink, I think it's EXTREMELY ambitious to believe Fusion will be competitive with a full-blown 5770. On top of that, in a year's time the 5770 will be nearly irrelevant for anything beyond very casual gaming; new titles will keep pushing it out of the picture as even a passable gaming card. That brings Fusion back to the same kind of casual use that onboard Intel HD (previously GMA), the Nvidia 7150, ATI's onboard HD4200, etc. see today.
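Quick back-of-envelope, assuming dual-channel DDR3-2000 and a stock 5770: DDR3-2000 gives you roughly 2000 MT/s x 8 bytes x 2 channels = ~32 GB/s of peak bandwidth, and that pool is shared with the CPU and everything else in the box. The 5770's GDDR5 runs at 4.8 GT/s effective on a 128-bit bus, so 4.8 x 16 bytes = ~76.8 GB/s, all to itself. That's well over double the bandwidth before the integrated part even starts fighting the CPU cores for access.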
What looks to be going away is Nvidia selling chipsets. That's already pretty much a done deal.
As for low- to mid-range desktops, how often do you even see OEMs offer a GT220/5450/etc. as standard equipment? Personally, I almost NEVER see those types of cards offered. Why?
(1)- They're not any better than modern onboard for stuff like MS Office, BluRay, Facebook, etc.
(2)- They're not acceptable even for very casual gaming, unless you're playing ancient games. Even WoW (bleh) will choke on these cheap cards with anything but terrible detail settings.
What is going to continue to challenge Nvidia is ATI's discrete products. Nvidia will lose the rest of their onboard chipset crap, but that's a done deal anyway. Neither SB nor Fusion will make more than a tiny dent in discrete sales; they'll just replace chipset-based video with CPU-resident video. Gamers, even pretty casual ones who might toss 'Rage' or 'Crysis 2' into their bin at BB, will not see playable results from these cheap integrated solutions any more than the HD4200 or a Core i3 GPU gives today. 19fps instead of 7fps is not a win here. Gamers are a moving target, and we've heard many, many years of promises about how onboard video was finally going to be playable for games. Has it ever been true? Only if you freeze time. You can use Intel HD to play games from 2004 pretty well, can't you? Can ATI's HD4200 run Crysis? Can it even run something more pedestrian, say Fallout 3 on an old engine? Hah.
This conjecture about some onboard stuff from Intel or ATI making more than a superficial change is just mental masturbation. If Nvidia dies, it will be because they were beaten at the discrete game, and they have been for most of recent history already. Look back at the past decade or so and think of the most efficient bang/buck GPUs of each ~2-year era (think ATI 9500Pro, Nvidia Ti4200, 8800GT, X1800GTO, etc.): what kind of memory they used (clock speed/bus width), how much heat they dissipated, and how big the die was. Now imagine any system from that same era with that GPU slammed into the CPU die, with all of its memory having to come from main system RAM. Feasible? I think not.
I think Fusion will eventually come out, it will be decent, and it will make for a more elegant solution than chipset-resident video for very casual use. But it'll never overlap with discrete sales, aside from the random poor saps who might otherwise have plugged a GF7300GS into a system that already had an onboard ATI HD4200.
