Fusion vs. Nvidia High End Cards

T2k

Golden Member
Feb 24, 2004
1,665
5
81
The big problem for Nvidia with Tegra is that no one is using it. Marvell just released a triple-core ARM CPU, and Snapdragon is probably the fastest. Tegra has to compete with these products.

Add to that Intel shrinking its SB-based Atoms down. So while ARM CPUs are trying to move up, Intel is trying to move down with Atom.

And let's not forget the main reason Tegra sucks so much that most system integrators dropped it: it kills the battery.
Yeah, those pesky mobile system designers dared to actually test it and found out NV was once again talking BS about energy consumption...
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
I won't be remotely surprised if Fusion can do Hybrid CrossFire with AMD discrete GPUs in the future. 400 SPs on the CPU working with an extra ~1K+ SPs on the GPU looks tasty. That's one less reason to buy Nvidia.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
The big problem for Nvidia with Tegra is that no one is using it. Marvell just released a triple-core ARM CPU, and Snapdragon is probably the fastest. Tegra has to compete with these products.

Add to that Intel shrinking its SB-based Atoms down. So while ARM CPUs are trying to move up, Intel is trying to move down with Atom.

Motorola is going to use it in their Droid 2 phone. And tablet manufacturers are picking it up as an alternative to Atom.
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
Motorola is going to use it in their Droid 2 phone. And tablet manufacturers are picking it up as an alternative to Atom.

I've seen Tegra used in a few prototype tablets & dumb tablets, but I have yet to see any of those make it to market. Other devices/companies that talked about using Tegra dropped it in favor of other chips.

This is probably why Nvidia was suddenly talking about Tegra 3 & 4. I don't think Tegra is horrible, and I liked what it was able to do in the Zune. But Nvidia needs to make Tegra a lot better by the 3rd gen if they want to be competitive in the smartphone market. Using GloFo for Tegra is probably a good start.

Personally, I'm more interested to see what Marvell is able to do with that tri-core ARM. Then there is also Intel. Are they crazy enough to get an Atom into a smartphone, or will they just buy out ARM as a company?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Uhhh... forgetting Bobcat much?

Is it out yet? Are there any sales figures for Bobcat/Ontario/Bulldozer to gauge whether they sold well? When AMD delivers, when we see the prices, when we see the performance, and when the sales show up, then we can revisit.

For example, Llano is already rumored to be delayed until Q3 2011. Intel will have even faster processors/APUs by then. Similarly, both AMD and NV will have even faster graphics cards by 2011. There is so much hype for APUs, it's not even funny; I can't think of another product that's had this much hype. All Intel and AMD did was take advantage of a more advanced manufacturing process to integrate the GPU into the CPU. The end result is taking a horrible GPU from the chipset and integrating it into the CPU, concurrently increasing its performance to mediocre (which would have happened on the chipset anyway due to advances in engineering and technology) - and suddenly everyone and their grandmother is touting the end of the discrete GPU (and especially of NV, as if AMD is immune and will continue selling $50-600 GPUs).

The way I see it, we had slow chipset graphics + a CPU, and now we'll have slightly faster (but still slow) graphics inside the CPU. Nothing has really changed except a bump in performance.

I don't understand the hype behind APUs. Put it this way: if chipset graphics had increased their performance 2x every 12 months, would we say that discrete graphics is dead? Of course not. People are forgetting that next year discrete graphics will increase performance again, and a 400-SP HD 5000-class core will be nothing special when constrained by a lack of memory bandwidth.

My personal view is this:

The real reasons discrete GPUs are on the ropes are due to these factors:

1) Consoles are taking over and fewer and fewer people are gaming on the desktop, instead choosing to use the PS2/PS3/360/Wii.

2) Other portable devices satisfy the cravings of occasional gamers, such as the iPad, smartphones, and handhelds like the Nintendo DSi. With people's busy lives, they might be satisfied playing 30 minutes of simple, fun 2D/3D games on the go.

3) PC gaming sales are down this year, and with constant console ports in the last 3-4 years, there are hardly any killer PC games you can't play on a console, other than, say, Crysis, WoW and StarCraft 2. Sure, the graphics are better on the PC, but most people don't care about that - not enough to spend $200+ on a graphics card when a console is only $200-300 and doesn't have to be upgraded for 6-7 years. Bottom line: desktop PC gaming is more expensive, and it doesn't have enough exclusives to justify upgrades every 2-3 years for the majority of consumers.

4) For most games today, the only differences between a $200 and a $600 graphics card are the amount of AA and the ability to play at 2560x1600 - that's it. There is no way you could have played most games maxed out at 1920x1080 on a $200 graphics card 4-5 years ago. Today a $200 GTX 460 1GB will do so without breaking a sweat. Clearly, hardware advancements have vastly outpaced software advances in PC gaming over the last 3-4 years.

5) Job and economic uncertainty has contributed to consumers' reluctance to buy luxury items (i.e., graphics cards). It is no wonder the <$200 segment gets about 75% of all the sales volume for discrete chips.

6) Most consumers now purchase laptops/netbooks over desktops where discrete graphics is largely an afterthought for their needs.

If we look at sales of discrete graphics, they had already shrunk almost 40-50% before the APU was even in the picture, because of the 6 factors above. How come all these "discrete GPU"-bashing comments/articles never mention the true reasons for the decline of the discrete GPU, which has been happening for a long time now? :colbert:
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
Interesting world view.

Name a console-derived game I can't handily play on moderate to high settings on my 1080p TV.

Next year's console-derived games will require far more PC graphics horsepower because ... ?

PS: check out the latest Steam survey to see how the 57xx cards fare when actual COMPUTER gamers are putting down their money to buy a new graphics card.

You need to understand that console != PC in terms of specs. Console ports can be some of the worst offenders in terms of requiring quite a lot of graphics horsepower to play, which kind of sucks considering that the PS3 and X360 GPU designs are circa ~2006, but the optimizations for their specialized hardware and the low OS/API overhead bring pretty good results. There are also some tricks in play, as "1080p" on many/most console games is really a huge lie.

A good thread on that:

http://www.avsforum.com/avs-vb/showthread.php?t=1090213

One big problem on PC is that trying to run games in non-native resolutions for your LCD can be pretty ugly. And games coming out seem to have pretty significant hardware requirements.

http://game-debate.com/games/index.php?g_id=950&game=FIFA 2011

That ^^ requires a 3870/C2D/2GB as minimum system requirements. That probably nets you chuggy performance at 1280x1024 or so with no AA/AF/etc. My 5770, PhII X4, 4GB, and 1920x1200 LCD would probably be able to run it with moderate settings at 1920x1200. Other games, either already out (Metro 2033) or coming soon (Crysis 2, Rage, etc.), are proven to be, or will likely be, even harder to keep smooth without 5850/GTX 460-level GPUs at 1080p and above with good detail.

Are PC games going to become MORE or LESS power-hungry over the next year and beyond? I'm already resigned to the fact that I'll probably have to ditch the 5770 if I want more than a slideshow at good settings in titles like Rage or Crysis 2.

This all comes back to: it's spectacularly unlikely that, until 22nm or so arrives along with vastly increased memory bandwidth, a CPU+GPU combo will come out that can meaningfully match a full-blown 5770 in real-world testing. The memory bandwidth bottleneck and the thermal output don't add up to 5770-level performance for quite some time, and certainly not on 45nm, 40nm, or probably even 32nm process tech.
 

Martimus

Diamond Member
Apr 24, 2007
4,490
157
106
A year from now AMD will be prepping its Bulldozer/6000-based Fusion chips.

Bulldozer is not a Fusion chip. Llano is the highest-end Fusion chip being released by AMD next year. A Bulldozer-based Fusion chip is not yet on AMD's roadmap, but may come in 2012.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
Then there is also Intel. Are they crazy enough to get an Atom into a smartphone, or will they just buy out ARM as a company?

That would be quite surprising, considering Intel was the major player in the ARM world with StrongARM (ex-DEC) and later XScale for several years, until they spun it off/sold it to Marvell... ;)
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
http://game-debate.com/games/index.php?g_id=950&game=FIFA 2011

That ^^ requires a 3870/C2D/2GB as minimum system requirements.

* System requirements displayed are based on recommended system requirements and should be used as a guide only.

I think you are too pessimistic if you think you'll have a slideshow on a 5770 in Crysis 2 or Rage. Rage won't be demanding imo and Crysis 2 should run faster than Crysis 1 according to the developer. In addition, even with Crysis 2 on the lowest settings, it will look better than any console game I bet. ;)
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Lol... I'll grant you they sell more; I'm not betting my farm on that. I was pointing out that the 'obviously Intel will sell *droves* more' assumption might not apply in the future.

LOL, are you serious???
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
I don't know. 15 years ago the CPU did everything; soon the CPU will do everything again....

It's natural to think it will rock because the interface distance is much shorter, but is it really better? For a CPU/GPU combo to work, almost all assembly code needs to be rewritten, and the OS must be modified for it. Otherwise, it is just a half-sized CPU + a half-sized GPU without onboard memory.

LOL, isn't this awesome? I made the same comment a few days ago: we are returning to the computing world of 1992-1994, where the CPU did everything. It will probably end up the same, for a while, except for high-end stuff. Hopefully the pro graphics and high-end markets are enough to keep R&D flowing for these GPUs. Time will tell.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
* System requirements displayed are based on recommended system requirements and should be used as a guide only.

I think you are too pessimistic if you think you'll have a slideshow on a 5770 in Crysis 2 or Rage. Rage won't be demanding imo and Crysis 2 should run faster than Crysis 1 according to the developer. In addition, even with Crysis 2 on the lowest settings, it will look better than any console game I bet. ;)

Ah, I misread that link. Going by typical (GTA4) console port performance, it looks to match what I would consider minimal.

My 5770 already chugs at 1920x1200 in many games; I have to tone down the effects quite a lot, particularly in Metro 2033, GTA4 and Mafia 2. Expecting good performance (~60fps constant) at 1080p or 1920x1200 in upcoming titles probably means something better than my 5770, unless I just decide to live with very low details. I wouldn't expect a Fusion variant to match the performance of a real 5770 for at least 2 years (waiting for process tech to come online); their current efforts look pretty crap. Did you see the Fusion AVP DX11 demo? Low-res textures, no enemies, and what looked like total ass for a framerate, all seemingly running at something like 1280x720. It looked like little more than IGP performance, about what one would expect a dirt-cheap 5450 DDR2 card to push, if that. It's just a tech demo, so we shouldn't judge it too harshly, but logically the current limits of process tech don't leave a lot of room to fit a worthy GPU onto a CPU die and keep it thermally manageable.

I think Fusion will be a great step towards simplifying mobos, and along with Intel's on-die GPUs it will eliminate the extreme low-end/obsolete GPUs and, more immediately, the chipset-based IGPs. Other than that, I don't think it's going to be able to make a dent in discrete cards for gaming, because it's chasing a moving target. By the time a Fusion chip comes to market that does equal the 5770, what games will we be running? What cards will be available for $100, $150, $200, etc.? Will it be competing against the discrete cards of the ATI 7870 and GTX 680 generations? If a $119 ATI HD 7750 is 5 times faster than an ancient 5770 and the game you want to play in summer 2013 won't run worth a damn on it, how relevant is that GPU for anything other than Facebook, classic/entry-level gaming, and HD video?
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
That would be quite surprising, considering Intel was the major player in the ARM world with StrongARM (ex-DEC) and later XScale for several years, until they spun it off/sold it to Marvell... ;)

Yeah, I see AMD selling out to Qualcomm as a very similar outcome, despite the very different reasons for the sale. It cut both of them off from making some nice cash in the booming smartphone market.
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
and suddenly everyone and their grandmother is touting the end of the discrete GPU

You also need to stop reading so much into people's posts. I don't think anyone here has said anything as foolish as predicting the end of the discrete GPU in general.

Part of the hype over APUs is the potential gains from GPU acceleration. Just wait until software vendors are able to take advantage of APUs & discrete GPUs together. Also, a big problem with GPGPU today is the latency of the PCIe system bus, which an on-die GPU largely sidesteps. It will take a couple of generations, but around the 3rd gen we should be seeing some very nice GPU cores in APUs - at least from AMD, once they get Bulldozer going with at least a 7000-series-based GPU core.

Just try to imagine a hex- or even octo-core, hyper-threaded CPU with a powerful GPU core capable of OpenCL & DirectCompute. Apple already ships OpenCL in its OS, and Microsoft has multiple open APIs for using the GPU. Yes, it will take another 2-4 years, but they're coming.
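
If you're wondering what "taking advantage of the GPU" actually looks like to a software vendor, here's a rough host-side sketch against the standard OpenCL 1.1 C API - untested, and names like the "scale" kernel and the file name are just made up for illustration. The point is that the same API path runs the kernel whether the SPs sit on a discrete card or on the APU die; what changes is how far the data has to travel to get there:

/* Rough OpenCL sketch: grab the first GPU device and run a trivial kernel.
 * Illustration only - no error checking, plain OpenCL 1.1 C API.
 * Build with something like: gcc ocl_sketch.c -lOpenCL */
#include <stdio.h>
#include <CL/cl.h>

static const char *src =
    "__kernel void scale(__global float *v, float f) {"
    "    int i = get_global_id(0);"
    "    v[i] *= f;"
    "}";

int main(void)
{
    float data[1024];
    size_t n = 1024;
    for (size_t i = 0; i < n; i++) data[i] = (float)i;

    cl_platform_id plat;
    cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    /* CL_DEVICE_TYPE_GPU matches a discrete card or an APU's GPU core alike */
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "scale", NULL);

    /* On a discrete card this copy crosses the PCIe bus; on an APU the
     * buffer stays in system RAM right next to the CPU cores. */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(data), data, NULL);
    float factor = 2.0f;
    clSetKernelArg(k, 0, sizeof(buf), &buf);
    clSetKernelArg(k, 1, sizeof(factor), &factor);

    clEnqueueNDRangeKernel(q, k, 1, NULL, &n, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);

    printf("data[10] = %f (expected 20.0)\n", data[10]);

    clReleaseMemObject(buf); clReleaseKernel(k); clReleaseProgram(prog);
    clReleaseCommandQueue(q); clReleaseContext(ctx);
    return 0;
}

In real software you'd obviously check every return code and enumerate platforms/devices properly, but the API itself doesn't care where those SPs live - which is exactly why the on-die GPU cores become useful for more than just drawing the desktop.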

I also see this as the final death of ancient IGPs in systems, especially on Intel's end, where the same GMA chip was used for 2 years because Intel made millions of them. With the GPU on the die, upgrading the CPU in people's systems also means getting a new IGP - especially on AMD systems, where people can use the same mobo for years.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Ah, I misread that link. Going by typical (GTA4) console port performance, it looks to match what I would consider minimal.

My 5770 already chugs at 1920x1200 in many games; I have to tone down the effects quite a lot, particularly in Metro 2033, GTA4 and Mafia 2. Expecting good performance (~60fps constant) at 1080p or 1920x1200 in upcoming titles probably means something better than my 5770, unless I just decide to live with very low details. I wouldn't expect a Fusion variant to match the performance of a real 5770 for at least 2 years (waiting for process tech to come online); their current efforts look pretty crap. Did you see the Fusion AVP DX11 demo? Low-res textures, no enemies, and what looked like total ass for a framerate, all seemingly running at something like 1280x720. It looked like little more than IGP performance, about what one would expect a dirt-cheap 5450 DDR2 card to push, if that. It's just a tech demo, so we shouldn't judge it too harshly, but logically the current limits of process tech don't leave a lot of room to fit a worthy GPU onto a CPU die and keep it thermally manageable.

I think Fusion will be a great step towards simplifying mobos, and along with Intel's on-die GPUs it will eliminate the extreme low-end/obsolete GPUs and, more immediately, the chipset-based IGPs. Other than that, I don't think it's going to be able to make a dent in discrete cards for gaming, because it's chasing a moving target. By the time a Fusion chip comes to market that does equal the 5770, what games will we be running? What cards will be available for $100, $150, $200, etc.? Will it be competing against the discrete cards of the ATI 7870 and GTX 680 generations? If a $119 ATI HD 7750 is 5 times faster than an ancient 5770 and the game you want to play in summer 2013 won't run worth a damn on it, how relevant is that GPU for anything other than Facebook, classic/entry-level gaming, and HD video?

I think what you're missing is that we're getting more and more console ports. Their bar for a satisfying experience is sub-1280x720 resolution, no AA, no AF, and 30 FPS. So anything above that is just gravy for us.

When most of the people I know who game on both console and PC don't seem to push as hard on upgrading their PC hardware anymore, it worries me that this trend will start to spread. The few of us who care about IQ will slowly decline in number, since developers won't give a crap about adding extra eye candy to our games. We'll be throwing AA/AF/resolution at our ports, and I doubt they'll make the games require huge amounts of horsepower.
 

lifeblood

Senior member
Oct 17, 2001
999
88
91
Fusion may or may not seriously threaten Nvidia's high-end cards. By the time Fusion is out and mainstream with its 5770-level graphics, both AMD and Nvidia discrete cards will be a generation or two past the 5xxx series. The real question is how the APU will work with a discrete GPU.

First scenario - If you add a discrete GPU and it disables the CPU's graphics core (or "bypasses" it), then AMD and Nvidia are still on equal ground. There is no change from the current situation.

Second scenario - If you add a discrete AMD GPU, the discrete card and the APU create a Hybrid CrossFire arrangement, and the APU actually provides a benefit, then there is an issue. If the AMD APU plays equally nicely with either an AMD or an Nvidia GPU, then they are equal. I doubt that will happen. I bet the APU plays much nicer with AMD GPUs than Nvidia GPUs, and I'm willing to bet the work AMD has been doing on Hybrid CrossFire has been aimed at exactly this. There are two big caveats to this scenario, though: the APU must provide a real advantage, and AMD's drivers must work flawlessly. I don't think either of those things will happen.

Personally, I prefer scenario 1, just because I hear of enough problems with CrossFire that I don't want it crammed down my throat. Too many problems as far as I'm concerned.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
I think what you're missing is that we're getting more and more console ports. Their bar for a satisfying experience is sub-1280x720 resolution, no AA, no AF, and 30 FPS. So anything above that is just gravy for us.

When most of the people I know who game on both console and PC don't seem to push as hard on upgrading their PC hardware anymore, it worries me that this trend will start to spread. The few of us who care about IQ will slowly decline in number, since developers won't give a crap about adding extra eye candy to our games. We'll be throwing AA/AF/resolution at our ports, and I doubt they'll make the games require huge amounts of horsepower.

I hope not.

1280x720 upscaled to an artificial 1080p on a TV looks much better than running 1280x720 on a native 1080p LCD PC Display. Add to that the consoles *usually* do pretty well with consistent framerates, while a 30fps average framerate on PC can be a nightmare thanks to minimums that might drop into the teens, and I don't think all that many people will be on board for that experience. Either more people will leave PC gaming and just use consoles, or they will demand better results which require better hardware.

Next-gen Xbox and PS are also on the way albeit at least a couple of years off, and I'm sure a big selling point will be true HD gaming along with much better graphical detail. That will raise the standard considerably, and weak ports based on low resolution textures won't cut the mustard.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
Fusion may or may not seriously threaten Nvidia's high-end cards. By the time Fusion is out and mainstream with its 5770-level graphics, both AMD and Nvidia discrete cards will be a generation or two past the 5xxx series. The real question is how the APU will work with a discrete GPU.

First scenario - If you add a discrete GPU and it disables the CPU's graphics core (or "bypasses" it), then AMD and Nvidia are still on equal ground. There is no change from the current situation.

Second scenario - If you add a discrete AMD GPU, the discrete card and the APU create a Hybrid CrossFire arrangement, and the APU actually provides a benefit, then there is an issue. If the AMD APU plays equally nicely with either an AMD or an Nvidia GPU, then they are equal. I doubt that will happen. I bet the APU plays much nicer with AMD GPUs than Nvidia GPUs, and I'm willing to bet the work AMD has been doing on Hybrid CrossFire has been aimed at exactly this. There are two big caveats to this scenario, though: the APU must provide a real advantage, and AMD's drivers must work flawlessly. I don't think either of those things will happen.

Personally, I prefer scenario 1, just because I hear of enough problems with CrossFire that I don't want it crammed down my throat. Too many problems as far as I'm concerned.

Very interesting points. Syncing that kind of performance is going to be really tricky, though. I suppose anything is possible. As yet, with the conspicuous lack of even multithreaded GPU drivers (to help balance the load with multicore CPUs working with DX), getting disparate GPU generations working together in a meaningful way seems like a pipe dream. Perhaps if DX12 or so allows GPUs to help in non-standard ways (a la CUDA using the GPU for non-graphical work), then it could provide a benefit without needing to be as tightly tied together.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
IMO Fusion will be more efficient from a performance per transistor standpoint. It will also probably have excellent AA performance, seeing as it can interface so seamlessly with the CPU and the memory.

Despite all this, it will take pretty much the same number of transistors to do what we need these parts to do. Whether those transistors are on a GPU, a CPU, or an APU, it doesn't matter too much, and this is why IMO NV still has several years left at the least.

Fusion is nice because it reduces the number of components needed, and it will let users upgrade their graphics memory (since the GPU shares system RAM). I think it has a number of advantages which will make it a huge sales hit. I think Fusion will take over the midrange within a few years due to the advantages I mentioned, provided that AMD comes out with something moderately powerful. Given the choice, I think most people would take a GPU with upgradeable memory over one without.
 

dzoner

Banned
Feb 21, 2010
114
0
0
Bulldozer is not a Fusion chip. Llano is the highest-end Fusion chip being released by AMD next year. A Bulldozer-based Fusion chip is not yet on AMD's roadmap, but may come in 2012.

'Prepping' was a bad word choice, perhaps? By 'prepping' I meant AMD will be working on the Bulldozer/6-7xxx chip for a 2012 release.

Llano will already have been released by the end of 2011.

A Bulldozer Fusion chip is on the roadmap for AMD. The Llano core was strictly a transition core.