Do Kepler and Maxwell have an AIB future?


AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
Can you play current DX 9 games at high detail using discrete cards from 2004?
My point is games will scale down; you don't have to have tessellation and other settings cranked up to max. The statement, "Fusion APU will not be able to handle them" is a blanket statement and not factual. What Fusion WILL bring to the table is the ability to play current titles at respectable settings and frame rates considering the power envelope and price point, something we have not seen before.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
My point is games will scale down; you don't have to have tessellation and other settings cranked up to max. The statement, "Fusion APU will not be able to handle them" is a blanket statement and not factual. What Fusion WILL bring to the table is the ability to play current titles at respectable settings and frame rates considering the power envelope and price point, something we have not seen before.

You can't argue with what you said, because it is head-hurtingly vague.

I bolded the weasel words/statements.


Nobody truly believes that discrete GPUs are going away for a SoC solution. Not AMD, not nV... not Intel.
 
Last edited:

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
Nobody truly believes that discrete GPUs are going away for a SoC solution. Not AMD, not nV... not Intel.
No one with any sense said that either. Well, maybe you did, I'm not sure. And I take it you are not able to understand what "respectable" means?
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
And I take it you are not able to understand what "respectable" means?

That is totally subjective; console-type graphics may be respectable to many. In my case, respectable is at least medium settings at native resolution.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
AMD will not be able to release a faster APU (GPU performance) until 2013 as I have explained earlier...

I don't agree with this at all.

Ok, can you explain how they will increase the performance without enlarging the die size in the same 32nm lithography?
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
Ok, can you explain how they will increase the performance without enlarging the die size in the same 32nm lithography?

Well, in AMD's case they can always "swap out" GPU cores for upgraded models.

I don't think the discrete GPU market will go away, but it will slim down a bit.

The nice thing about APUs is that when you replace/upgrade your CPU, the IGP & memory controller will be upgraded as well.

Now it's died down a bit, but some of you guys ragging on Intel's IGPs must not be reading the articles on Sandy Bridge, or noticing how even the current "Intel HD" on-die video cores are already fast enough for Blu-ray video. Sandy Bridge has already made a big jump over that and added support for various codecs & new APIs.

New Intel Atom CPUs based on Sandy Bridge tech will not have any need for Nvidia ION GPUs. AMD's Atom counterpart based on Bobcat will also have no need for a discrete GPU.

Another telling sign is that D-Link dropped the Tegra 2 from the Boxee Box in favor of a new Intel Atom because Tegra couldn't handle the 1080p D-Link needed.

Also to touch on DX11 support:
http://www.techradar.com/news/compu...intel-dx11-graphics-around-the-corner--716359
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Reading over the OP and..... wow.

First off, x86 CPUs are in the realm of "epic fail" in terms of console CPU standards. Will Sony even consider launching a console with a CPU capable of 1/4 of the throughput of their previous generation? All BC (backwards compatibility) is out the window; you can't dream of emulating a chip you can't come close to matching in raw number crunching. Nothing like what you are talking about has ever been considered by any console maker.

3. Next-gen console hardware needs, with a single-display 1080P target, will be achievable with an off-the-shelf (or custom-designed FOR consoles) APU chip by the time they are introduced, providing hardware commoditization and amortization across platforms.

You won't ever see the console makers using identical hardware, it will never happen. If the console makers all agreed upon a fixed hardware platform and used off-the-shelf components, anyone could make them, which would eliminate any claim to royalties the console makers have, and royalties are how they make their money in this business. MS isn't making back those billions of dollars with their very slim lineup of first-party games; they are making money because every EA game you buy for the 360 earns them a cut. That is how the console business works. Even if some fairy comes down and eliminates the business realities of the world as it is, do you think you are going to get MS, Sony and Nintendo to agree on a standard as complex as an entire console system? I'd give, at best, even odds that they would manage to agree on what day of the week it is. *FAR* too much ego between the three of them to even make it remotely viable (each one of them is equally guilty in their own way IMO).

As to the idea of an IGP going toe to toe with the next-gen consoles at launch: when are IGPs going to compete with the five-year-old consoles?

http://www.anandtech.com/show/3871/the-sandy-bridge-preview-three-wins-in-a-row/7

There is Sandy Bridge running lower settings than the consoles on several games and not running them all that great. This is the latest and greatest Intel has on tap, and it is running games that a five-year-old console runs better. Sometime within the next few years you expect them to overcome a console-sized generational gap and then some? To say unlikely is like saying I'm unlikely to solve cold fusion before I finish typing this post.

But what so many fail to realize is that this is a different game now. This is for all the marbles.

I have to disagree with this statement. No matter what you say, I do not believe that Intel is going to implode and fail as horribly as their Larrabee debacle. Intel has proven they can be outlandishly foolish, spend billions, and recover. Even though they suffered one absolute, utter, complete embarrassment that would make even a start-up IP company look foolish, they have the resources to pull through it just fine. Look at Itanic, and that was supposed to be for all the marbles too (and the reason why Intel wasn't going to back x86-64). Obviously Intel failed their way out of high-end graphics before they even got started, but at least for the next decade x86 should be able to keep them doing fairly decent business. ARM won't gain enough of a foothold to completely wipe Intel out for a long while, even if things keep going as perfectly for them as they have for the last decade.

Another telling sign is that D-Link dropped the Tegra 2 from the Boxee Box in favor of a new Intel Atom because Tegra couldn't handle the 1080p D-Link needed.

Tegra 2 is a cell phone chip, not a laptop competitor ;)
 

animekenji

Member
Aug 12, 2004
85
0
0
CPUs with built-in GPU hardware are going to kill the market for separate graphics chips at the lowest end of the market, which is more price sensitive than the top. There will always be demand for the ability to upgrade components to increase performance, so high-end discrete video cards and onboard graphics will still be there, but low-end and some mainstream discrete GPUs are going to disappear. AMD, and to a lesser extent Intel, are ideally positioned for this change. Intel needs to kick their game up a notch, though, because their existing IGP solutions still leave a lot to be desired compared to AMD and nVidia. This leaves nVidia in a hard place, because all they make, for the most part, are graphics processors and integrated chipsets, and who would they partner with when the two biggest CPU manufacturers are already preparing their own APUs? nVidia is going to be the odd man out in this scenario. Even if nVidia manages to find a partner, two companies lack the vertical integration that comes with being one company that can do it all. With all the trouble nVidia is already in, with Fermi still having major power consumption and heat issues, HD6000 coming in two weeks with no effective competition, and Kepler likely to be released at the same time as HD7000, I can't see them being a major player for much longer.
 
Last edited:

animekenji

Member
Aug 12, 2004
85
0
0
I don't believe that in such a short period of time AMD would paint themselves into a corner by locking Nvidia out of their motherboards as far as equality of performance between various families of graphics cards goes. From what I understand, Fusion is primarily going to be aimed at the low to low-mid OEM market, with add-in boards still being the choice amongst gaming enthusiasts. In the more distant future I can see better prospects for AMD and Intel APU platforms, but I don't think it's going to happen quickly enough to handicap Kepler or Maxwell. AMD doesn't really have the luxury of pushing potential Nvidia buyers into Intel's arms. I'm also pretty sure Nvidia is keeping its finger on the pulse of all this and isn't just going to go quietly into the sunset. They must have contingency plans in the works to mitigate any decrease in market share from Fusion-type platforms. As far as TWIMTBP and PhysX go, if anything they're becoming more prevalent, and people have been hand-wringing over the supposed impact of console accommodation on PC games for some time. It really hasn't been much of a problem yet.

Ah, but you fail to see the subtlety. If AMD releases Fusion with the ability to hybrid CrossFire with ANY of their PCIe graphics cards, instead of the current situation where hybrid CrossFire only works with the lowest-end cards, then why would anyone want to put an nVidia card in their AMD-based system? They would be locked out of combining their available GPU resources if they did, resulting in lower performance. Adding an AMD graphics card would be a much better value proposition than an otherwise equivalent nVidia card.
 
Last edited:

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
CPUs with built-in GPU hardware are going to kill the market for separate graphics chips at the lowest end of the market, which is more price sensitive than the top. There will always be demand for the ability to upgrade components to increase performance, so high-end discrete video cards and onboard graphics will still be there, but low-end and some mainstream discrete GPUs are going to disappear. AMD, and to a lesser extent Intel, are ideally positioned for this change. Intel needs to kick their game up a notch, though, because their existing IGP solutions still leave a lot to be desired compared to AMD and nVidia. This leaves nVidia in a hard place, because all they make, for the most part, are graphics processors and integrated chipsets, and who would they partner with when the two biggest CPU manufacturers are already preparing their own APUs? nVidia is going to be the odd man out in this scenario. Even if nVidia manages to find a partner, two companies lack the vertical integration that comes with being one company that can do it all. With all the trouble nVidia is already in, with Fermi still having major power consumption and heat issues, HD6000 coming in two weeks with no effective competition, and Kepler likely to be released at the same time as HD7000, I can't see them being a major player for much longer.

I bet they are getting the wood ready to board up the windows!
 

crazylegs

Senior member
Sep 30, 2005
779
0
71
I'm not in a position to say how possible this would be and this probably isn't the right thread for it but the way things are going...

- Graphics are getting moved over / integrated with CPU, away from MB.

- Not going to equal or surpass a dedicated card, surely this is just common sense? The physical requirements would not allow this: space, heat, power, etc. (But of course they may well be fine for 99% of stuff other than gaming.)

- This is where I apologise if I'm talking out of my ass, or if this has been said before. If in the near future there is a more widely accepted open "physics" engine (IMO ATI/AMD + Intel could mutually benefit from an open standard to replace NV "PhysX", right?), would it not be PERFECT to dedicate the on-die GPU to the physics work, while the CPU carries on number crunching and the dedicated GPU keeps doing its job, pumping a shit tonne of pixels so we can all be looking at 10' wide 3D-OMG screens :)

- Or if AMD/ATI just get their act together with their alternative to PhysX and offer a CPU/GPU combination that does the above when combined with a dedicated AMD/ATI GPU... that would seem a very attractive option...

Just a thought.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
With architectural improvements on a mature 22nm process, 2H 2013 high-end APUs should see at least 5870-class graphics performance. That can handily run nearly all current pure high-end computer games at 1080P on highest settings and ALL console-derived games at highest settings. Next-gen console graphics are not going to exceed current state-of-the-art computer graphics, so that 2013 APU should be able to comfortably run any next-gen console games that are released, which means it will comfortably run all but a few of the games developers will be putting out, because the next-gen console will continue the trend of defining computer game graphics.

2013 is not far away. Nvidia is currently in the early stages of designing 2013's generation of graphics chip and well into designing their 2012 (Maxwell) chip. But if there is no consumer market for that chip, why would they be compromising their chips by designing in features for the consumer market instead of designing purely for the professional and HPC markets?

You are comparing 2013 APU graphics capability against today's resolutions. This is wrong. UHD is under development, and could be here sometime between 2013 and 2015. There is talk of something like a 7K horizontal resolution. That's a lot of pixels to push, and assuming UHD happens, discrete GPUs will still be needed for performance gaming. APUs will be relegated to laptops and low-end desktops, just like today: integrated graphics for gaming is hardly capable of playing sufficiently at 1024x768, a far cry from today's 1920x1200 res.
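To put rough numbers on that pixel load (a quick back-of-the-envelope Python sketch; the 7680x4320 figure is only an assumed value for a "7K-wide" UHD mode, since no such spec exists yet):

Code:
# Rough pixel-count comparison; the UHD mode below is an assumption, not a spec.
resolutions = {
    "1024x768": (1024, 768),
    "1920x1080": (1920, 1080),
    "1920x1200": (1920, 1200),
    "7680x4320 (assumed UHD)": (7680, 4320),
}

base = 1920 * 1080  # 1080P as the reference point
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>24}: {pixels / 1e6:5.1f} MP, {pixels / base:4.1f}x the pixels of 1080P")

Sixteen times the pixels of 1080P is the kind of jump that would keep discrete cards relevant for performance gaming.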
 

psoomah

Senior member
May 13, 2010
416
0
0
You are comparing 2013 APU graphics capability against today's resolutions. This is wrong. UHD is under development, and could be here sometime between 2013 and 2015. There is talk of something like a 7K horizontal resolution. That's a lot of pixels to push, and assuming UHD happens, discrete GPUs will still be needed for performance gaming. APUs will be relegated to laptops and low-end desktops, just like today: integrated graphics for gaming is hardly capable of playing sufficiently at 1024x768, a far cry from today's 1920x1200 res.

1080P will be the mass-market resolution for AT LEAST the next decade, and it's a dead certainty the next-gen consoles will be targeting 1080P as the main resolution.

"just like today" ... ... because technology moves at warp speed? because Intel and AMD are entering an APU graphics war to determine who dominates that market? Why?
 

psoomah

Senior member
May 13, 2010
416
0
0
Ah, but you fail to see the subtlety. If AMD releases Fusion with the ability to hybrid CrossFire with ANY of their PCIe graphics cards, instead of the current situation where hybrid CrossFire only works with the lowest-end cards, then why would anyone want to put an nVidia card in their AMD-based system? They would be locked out of combining their available GPU resources if they did, resulting in lower performance. Adding an AMD graphics card would be a much better value proposition than an otherwise equivalent nVidia card.

While AMD has only disclosed the CPU portion of the extensive power stepping/saving technologies they're building into their Fusion chips, they are undoubtedly doing the same for the GPU portion of the chips, and it is logical to assume for NI GPUs and beyond as well, so that a Fusion system would finely step power usage to supply just what was needed, from the onboard graphics seamlessly through offboard AMD graphics cards.

Maximized power efficiency.

They may not be fully there with Fusion 1st gen and 6xxx, but they will be very close by Fusion 2 and 7xxx.

There had to be a reason that at CES they proudly touted all those power efficiency technologies they were engineering into the Fusion CPU and said nothing whatsoever about what they were doing with the GPU.

I'm guessing it's a little surprise for Nvidia.
 
Last edited:

psoomah

Senior member
May 13, 2010
416
0
0
You won't ever see the console makers using identical hardware, it will never happen. If the console makers all agreed upon a fixed hardware platform and used off-the-shelf components, anyone could make them, which would eliminate any claim to royalties the console makers have, and royalties are how they make their money in this business. MS isn't making back those billions of dollars with their very slim lineup of first-party games; they are making money because every EA game you buy for the 360 earns them a cut. That is how the console business works. Even if some fairy comes down and eliminates the business realities of the world as it is, do you think you are going to get MS, Sony and Nintendo to agree on a standard as complex as an entire console system? I'd give, at best, even odds that they would manage to agree on what day of the week it is. *FAR* too much ego between the three of them to even make it remotely viable (each one of them is equally guilty in their own way IMO).

;)

The console makers have ABSOLUTE control of the gateway to their consoles, and any existing mod loopholes will be DRMed out of existence with the next-gen hardware. That's all they need. They could all use identical CPU/GPU hardware, with each getting their own hardware-coded DRM version of the chip. And why not? Nintendo's steamrolling success proved it's not about who has the best graphics.

Ten years is a very long time in hardware, and the target IS still single-display 1080P. There will be no need to get ahead of the engineering curve a la Cell technology, because off-the-shelf parts will provide sufficient processing power at the resolution target. Why bust your engineering/cost balls and run in the red for five years in a vain attempt to claim 'best graphics' in the next generation like Sony did in the last generation? I guarantee THAT isn't going to happen again with Howard Stringer at the helm. All he cares about is the projected console-lifetime profits.

Will a common hardware/development platform happen? Who knows. I'm saying there are compelling reasons for it to happen.
 

praktik

Member
Jan 23, 2009
40
0
0
You won't ever see the console makers using identical hardware, it will never happen. If the console makers all agreed upon a fixed hardware platform and used off-the-shelf components, anyone could make them, which would eliminate any claim to royalties the console makers have, and royalties are how they make their money in this business.

The console makers have ABSOLUTE control of the gateway to their consoles, and any existing mod loopholes will be DRMed out of existence with the next-gen hardware. That's all they need. They could all use identical CPU/GPU hardware, with each getting their own hardware-coded DRM version of the chip. And why not? Nintendo's steamrolling success proved it's not about who has the best graphics.

Ten years is a very long time in hardware, and the target IS still single-display 1080P. There will be no need to get ahead of the engineering curve a la Cell technology, because off-the-shelf parts will provide sufficient processing power at the resolution target. Why bust your engineering/cost balls and run in the red for five years in a vain attempt to claim 'best graphics' in the next generation like Sony did in the last generation? I guarantee THAT isn't going to happen again with Howard Stringer at the helm. All he cares about is the projected console-lifetime profits.

Will a common hardware/development platform happen? Who knows. I'm saying there are compelling reasons for it to happen.

Paging Trip Hawkins!

As a proud 3DO owner I lived the dream - only to see it dashed.

It's a great idea but I just don't see it happening. Plus, the idea of a common platform is vaguely socialist, isn't it?

Sounds kinda un-American to me... ;)
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
So Fusion will act like a 5XXX?

Consider that Fusion isn't done for the sake of graphics; it is done for the sake of evolving the compute model to one that leverages the GPU architecture for executing massively parallel instructions on data.

The graphics aspect of a Fusion product will always trail the graphics aspect of a dedicated discrete IC, for the simple fact that the CPU portion of the Fusion chip occupies die space and power/thermal budget which the discrete GPU IC doesn't have to share.

For applications that do take advantage of the APU model, though, the communication topology between CPU and GPU only stands to benefit from being on the same die, for all the same reasons that integrating the FPU with the ALU was a good idea, as was integrating the standalone SRAM cache of days of yore.
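A toy cost model of why that topology matters (illustrative only; every bandwidth and timing number below is an assumption, not a measurement): for small, chatty workloads the copy across the expansion bus can dominate the offload cost, which is exactly where an on-die GPU sharing the memory system wins even if its raw compute is weaker.

Code:
# Toy offload-cost model; all numbers are assumed for illustration only.
def offload_time_ms(data_mb, kernel_ms, link_gb_per_s):
    """Copy the working set to the device, run the kernel, copy results back."""
    transfer_ms = 2 * (data_mb / 1024.0) / link_gb_per_s * 1000.0  # round trip
    return transfer_ms + kernel_ms

data_mb = 16        # assumed working set per offload
pcie_gb_s = 6.0     # assumed effective bandwidth to a discrete card
on_die_gb_s = 25.0  # assumed effective bandwidth to an on-die GPU

discrete = offload_time_ms(data_mb, kernel_ms=0.5, link_gb_per_s=pcie_gb_s)
apu = offload_time_ms(data_mb, kernel_ms=1.0, link_gb_per_s=on_die_gb_s)
print(f"discrete GPU: {discrete:.2f} ms per offload")
print(f"on-die GPU:   {apu:.2f} ms per offload (even with a 2x slower kernel)")

Flip the assumptions (a huge working set, a long-running kernel) and the discrete card's extra shader horsepower wins again, which is the same trade-off described above: the APU helps the compute model, not peak graphics.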
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91

I thought you would pick up on the facetiousness of my question. :)


Actually I was trying to get clarification, because nobody who understands anything about hardware could possibly think either Intel or AMD is going to have a non-discrete solution that trails add-in-board solutions by only a generation.

I figured I had interpreted the poster's statement incorrectly.
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
You won't ever see the console makers using identical hardware, it will never happen.

Just an FYI, the 360 & PS3 actually use the same IBM PowerPC CPU core. The difference is where they went from there.

Sony actually wanted the GPU integrated into the CPU. They just couldn't make it work the way they wanted, so Nvidia stepped up and solved the problem in a very limited time frame.

MS & Sony might not use off-the-shelf CPUs, but you can bet both are at least looking at CPU+GPU hybrid chips for their consoles.

Tegra 2 is a cell phone chip, not a laptop competitor ;)
Calling the Tegra 2 a cell phone chip is a cheap cop-out. I'm actually a fan of the Zune series with the Tegra chipset. That doesn't change the fact that Tegra 2 is limited compared to other ARM brands and is falling behind the mobile CPU curve. Hence the announcement of the Tegra 3 & 4 roadmaps. The Tegra series still hasn't been widely successful either.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
They could all use identical CPU/GPU hardware, with each getting their own hardware-coded DRM version of the chip. And why not? Nintendo's steamrolling success proved it's not about who has the best graphics.

I wouldn't call Nintendo's success a steamrolling yet for this generation. They made it out of the gate very quickly, but their sales trend is very poor right now. The HD twins have about 60% of the market this year versus about 45% a couple of years ago, and the trend is accelerating in their favor. Not saying the Wii is doing poorly in any way, but it appears that Nin is going to have to launch a new platform some time prior to the others, which significantly changes the ROI involved in such a platform, not to mention that their tie rate isn't comparable to the HD twins. In the mainstream Nin has done exceptionally well, but mainstream players buy a lot fewer games and play a lot less often than the hardcore market the HD twins cater to. Tie rate is the biggest factor, as that is what drives profits for MS and Sony; Nin lives on first-party sales, which dominate the financials for their platform, and as such they have made loads of cash (with multiple titles hitting the 20-million-sold realm, Sony and MS have never had a first-party title hit that number).

Ten years is a very long time in hardware, and the target IS still single-display 1080P.

The target is 1080p 3D; that's effectively 1920x1080 with 4x AA at 120 fps.

http://www.tomshardware.com/reviews/geforce-gtx-480-3-way-sli-crossfire,2622-11.html

Tri-SLI GTX 480s, the fastest graphics setup money can buy, don't hit half that performance level in a game that will be considered rather dated by the time the next-gen consoles hit. CrossFire 5870s can't hit four frames per second at that setting, less than 1/30th of the performance target for the next-gen consoles. You honestly think APUs are going to be 30 times faster than a 5970 when the next-gen consoles launch? Again, this is a title that will be considered outdated by the time they do come out.
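Spelling out that arithmetic (a small sketch; the 120 fps target assumes 60 fps per eye for 1080p 3D, and the ~4 fps figure is the CrossFire 5870 result cited from the link above):

Code:
# Rough performance-gap arithmetic from the figures cited in this post.
target_fps = 60 * 2          # assumed 60 fps per eye for 1080p 3D
crossfire_5870_fps = 4       # "can't hit four frames per second" at that setting

gap = target_fps / crossfire_5870_fps
print(f"Target: {target_fps} fps, CrossFire 5870s today: ~{crossfire_5870_fps} fps")
print(f"That is roughly 1/{gap:.0f} of the target, a ~{gap:.0f}x shortfall.")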

Why bust your engineering/cost balls and run in the red for five years in a vain attempt to claim 'best graphics' in the next generation like Sony did in the last generation?

Sony's cost problem was the Blu-ray drives, which they were leveraging to win a 'format war'. They won that 'war', btw. Their graphics chips are quite cheap (easily observable by looking at the small impact they have on nV's bottom line versus the number of PS3s sold).

For Cell, right now they are at a 115mm² die size with four times the computational power of the fastest i7 in terms of raw SP FLOPS, while the i7 is pushing a 263mm² die size with less computational power. Sony also has full rights to the Cell IP, which means they don't need to pay extra to get it fabbed for them. A chip less than half the size, which they own the rights to, with ~4x the peak throughput of the top x86 part. It shouldn't take people too long to figure out why x86 doesn't stand a chance in the console market. If the 6xxx series of parts shipped and they were four times faster than the GTX 480 and cost significantly less, what do you think that would do to nVidia? That is precisely what Intel or AMD face trying to get into the console market. Too expensive, too large, and too slow.
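Taking those figures at face value (a sketch only; the die sizes and the "four times the SP FLOPS" ratio are the numbers quoted in this post, not independently verified):

Code:
# Perf-per-area comparison using the figures quoted above (not re-measured).
cell_die_mm2 = 115
i7_die_mm2 = 263
cell_throughput_vs_i7 = 4.0   # "~4x the peak throughput of the top x86 part"

density_ratio = (cell_throughput_vs_i7 / cell_die_mm2) / (1.0 / i7_die_mm2)
print(f"By these numbers, Cell delivers roughly {density_ratio:.0f}x the SP "
      f"throughput per mm^2 of die area.")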

Will a common hardware/development platform happen?

No chance. Even if there were some minuscule possibility of it happening, it would be based around a POWER derivative and not x86. The PC is currently selling about 40 million games per year; consoles are selling 400 million. The consoles are currently all running PPC/POWER-based chips; if there is a common platform, it will be using IBM's flavor of processors, not Intel's or AMD's.

So from a business point of view, x86 is slower, larger, hotter and more expensive, with the only benefit being ease of porting to the PC. Why would anyone take the x86 parts? Strictly from a cost perspective, going with a PowerVR-licensed design along with Cell IP would be the cheapest option they could get out of the gate, and that still has nothing to do with using AMD or Intel.

Just an FYI, the 360 & PS3 actually use the same IBM PowerPC CPU core.

One of Cell's cores is close in design to the 360's CPU, but the chips are extremely different outside of that. I have read the book, btw.

Sony actually wanted the GPU integrated into the CPU.

Sony wanted to use software rendering a la Larrabee. Their idea was to use dual Cells, one for CPU work and one for graphics rendering. KK was rather vocal and open about this; it was an utterly stupid idea that never had a chance of working, they realized this late in development, and they decided to go to nV for a quick fix. They never had intentions of using a traditional GPU integrated on die with their CPU.

That doesn't change the fact that Tegra 2 is limited compared to other ARM brands and is falling behind the mobile CPU curve.

Really, that is interesting. Could you please explain why every cell phone maker besides RIM and Apple has announced their next halo phones as being Tegra 2 powered? I'm rather curious here; most of them have also announced Tegra 2 tablets as their high-end option.
 
Last edited:

golem

Senior member
Oct 6, 2000
838
3
76
Calling the Tegra 2 a cell phone chip is a cheap cop-out. I'm actually a fan of the Zune series with the Tegra chipset. That doesn't change the fact that Tegra 2 is limited compared to other ARM brands and is falling behind the mobile CPU curve. Hence the announcement of the Tegra 3 & 4 roadmaps. The Tegra series still hasn't been widely successful either.

Aren't most if not all ARM chips used in cell phones or similar low-power devices? Why would calling Tegra a cell phone chip be a cop-out then?
 

psoomah

Senior member
May 13, 2010
416
0
0
I thought you would pick up on the facetiousness of my question. :)


Actually I was trying to get clarification, because nobody who understands anything about hardware could possibly think either Intel or AMD is going to have a non-discrete solution that trails add-in-board solutions by only a generation.

I figured I had interpreted the poster's statement incorrectly.

AMD said at the last CES that their express intention was to update their Fusion chips annually, incorporating the previous generation's graphics architecture.

So ... I guess you're saying AMD doesn't know anything about hardware?

Think about it. AMD is entering an APU arms race with Intel. Why leave ammunition lying on the table?
 
Last edited: