Do Kepler and Maxwell have an AIB future?

psoomah

Senior member
May 13, 2010
416
0
0
I doubt they do based on the following points:

1. AMD and Intel's focus on APUs, which become more graphics-capable each generation, with AMD in particular able to smoothly and seamlessly scale between onboard and off-board graphics, putting Nvidia AIBs at a disadvantage on Fusion-based boards.

2. Consolidation has created behemoth game developers insistent on a unified development platform for console and computer games, and the continuing move across developers is to program for consoles first and computers as an afterthought. The next-gen console graphics standard will become much more set in stone for computer games than at present.

3. Next gen console hardware needs, with a single display 1080P target, will be achievable with an off the shelf (or custom designed FOR consoles) APU chip by the time they are introduced, providing hardware commoditization and amortization across platforms. Everyone can sell their consoles for a profit from the start with the next generation. And do so CHEAPLY. It's all about getting consoles into the hands of the consumers for the lowest price and staying in the black doing it.

4. Ken Kutaragi is gone and Howard Stringer was installed specifically to bypass the ingrained face-saving/consensus culture that would prevent Sony from joining a far more profitable common hardware/development consensus. If it's going to COST profits, why do it?

5. Nintendo faces far stiffer casual-gamer competition into the future, and the logic of being able to share in the considerable hardcore gaming profits will be compelling. It is likely all three consoles will become full-spectrum gaming and media-streaming platforms.

6. AMD's unique positioning to provide that console APU solution, and the industry-wide move to OpenCL and OpenGL that would accompany it, would leave PhysX and TWIMTBP out in the gaming cold.

7. Fully integrated Intel and AMD APUs are clearly where x86 computing is headed, with APU add-in boards as the upgrade path, eventually leaving Nvidia with no niche at all in x86 computing, which the developers and console manufacturers are fully cognizant of. They might as well get with the program with the next-gen consoles.

All of these points militate against Nvidia's consumer graphics and AIB future in general and Kepler and Maxwell in particular. Facing the impending release of NI, Nvidia is likely to keep losing mindshare and market share until Kepler comes out, presumably in late 2011, by which time Fusion will be well launched. In another two years Nvidia will be effectively locked out of the AMD platform altogether. They'll still have Intel, but losing that AMD mind and market share, plus almost certain continuing ferocious price/performance competition from AMD, isn't going to allow that to be very profitable.

Where is the future for Kepler and Maxwell in the AIB market?
 
Last edited:

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
I don't believe that in such a short period of time AMD would paint themselves into a corner by locking Nvidia out of their motherboards, as far as equality of performance goes between various families of graphics cards. From what I understand, Fusion is primarily going to be aimed at the low to lower-mid OEM market, with add-in boards still the choice among gaming enthusiasts. In the more distant future I can see better prospects for AMD and Intel APU platforms, but I don't think it's going to happen quickly enough to handicap Kepler or Maxwell. AMD doesn't really have the luxury of pushing potential Nvidia buyers into Intel's arms. I'm also pretty sure Nvidia is keeping its finger on the pulse of all this and isn't just going to go quietly into the sunset. They must have contingency plans in the works to mitigate any decrease in market share from Fusion-type platforms. As for TWIMTBP and PhysX, if anything they're becoming more prevalent, and people have been hand-wringing over the supposed impact of console accommodation on PC games for some time. It really hasn't been much of a problem yet.
 
Last edited by a moderator:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
deleted quote

Modified post:

High-end discrete GPUs aren't going to vanish anytime soon, though low-end ones look to die off as Sandy Bridge and Fusion take off. Not economical to make everybody buy a huge GPU on-die, not when many people don't game at all. Not to mention pro-grade discrete cards.
 
Last edited:

distinctively

Junior Member
Feb 13, 2009
18
0
0
FUD. Your thread at Semi-Accurate seems to have found a sympathetic audience. Big surprise there.

I don't think that thread would be considered "sympathetic". It barely stays on topic for a page. I do think that APUs are going to make nVidia's future more difficult in the gaming arena. They seem to be shifting/balancing focuses towards SOC and professional divisions for good reason. With Intel moving into a more competitive position, nVidia will have very tough competition in the low end. This is probably why nVidia is designing chips to be ever more GPGPU friendly.

I think the future is gonna be really interesting and players may continue to change positions for quite a while.
 

psoomah

Senior member
May 13, 2010
416
0
0
FUD. Your thread at Semi-Accurate seems to have found a sympathetic audience. Big surprise there.

Considering the next generation graphics picture by extrapolating into the future based on current data and logic is not FUD unless the arguments lack factual validity, coherence and logic.

I consider I raised valid points and presented a valid conclusion.

If you disagree with the merit of my points, present your counterpoints.

If someone has an issue with a thread, arguing the points raised on their merits is the accepted rational response and adds value to the forum. Attacking a thread without addressing the merits of its points is the irrational response and adds nothing of value to the forum.
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Considering the next generation graphics picture by extrapolating into the future based on current data and logic is not FUD unless the arguments lack factual validity, coherence and logic.

I consider I raised valid points and presented a valid conclusion.

If someone has an issue with a thread, arguing the points raised on their merits is the accepted rational response. Attacking a thread without addressing the merits of its points is the irrational response.

Okay technically not FUD I suppose, but what's your point? We've all known about fusion for ages now, and I'm sure that NV's AIB partners have known it, too. It's not news that Sandy Bridge and Fusion will destroy the low-end discrete video card market. But that leaves the mid-range and high-end. Not to mention that NV practically has a monopoly on pro-grade graphics. HPC and mobile are ways in which NV is attempting to make up for the imminent demise of low-end discrete cards and the demise of their chipset business. AIB partners like EVGA are already diversifying into stuff like mobos. If they don't evolve then they may end up in a bad place or out of business or something, but the stronger AIBs will still be around.
 

psoomah

Senior member
May 13, 2010
416
0
0
Okay technically not FUD I suppose, but what's your point? We've all known about fusion for ages now, and I'm sure that NV's AIB partners have known it, too. It's not news that Sandy Bridge and Fusion will destroy the low-end discrete video card market. But that leaves the mid-range and high-end. Not to mention that NV practically has a monopoly on pro-grade graphics. HPC and mobile are ways in which NV is attempting to make up for the imminent demise of low-end discrete cards and the demise of their chipset business. AIB partners like EVGA are already diversifying into stuff like mobos. If they don't evolve then they may end up in a bad place or out of business or something, but the stronger AIBs will still be around.

The points I raised address Nvidia losing the midrange and high end consumer markets also.
 
Last edited:

swerus

Member
Sep 30, 2010
177
0
0
As long as they make the fastest single GPU available to the public they will never lose the high-end market.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
The points I raised address Nvidia losing the midrange and high end consumer markets also.

I know and I am disagreeing; I think complete fusion may happen but not in a year or even 3. High-end GPUs are huge compared to CPUs.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
I say that we're a long way from seeing CPU+GPU packages beating high-performance GPUs. Nvidia's safe IMO.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
So from what you are saying, AMD will not introduce another Discrete GPU Family after 6000. :p

Discrete GPUs will continue to coexist with APUs as they coexisted with integrated Graphics (OnBoard) up until now. ;)
 

edplayer

Platinum Member
Sep 13, 2002
2,186
0
0
Agreed. High-end discrete GPUs aren't going to vanish anytime soon, though low-end ones look to die off as Sandy Bridge and Fusion take off.


I think it will take a while for low-end videocards to die off. Just because Sandy Bridge and Fusion are available doesn't mean that everyone will replace their computers with it immediately.

I estimate about 3 years before you see a significant drop off in low end videocards. Maybe 5 years or more before the market vanishes/becomes tiny
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I think it will take a while for low-end videocards to die off. Just because Sandy Bridge and Fusion are available doesn't mean that everyone will replace their computers with it immediately.

I estimate about 3 years before you see a significant drop off in low end videocards. Maybe 5 years or more before the market vanishes/becomes tiny

I agree and should have been clearer. SB and Fusion are not even available yet, and it may be a while till we see nothing but APU-style chips. I did not mean that low-end discrete cards would immediately die off, but that they would die off as APUs take off over a period of time.

In any case this doesn't help OP's argument--it actually further hurts OP's argument.
 

psoomah

Senior member
May 13, 2010
416
0
0
I don't think that thread would be considered "sympathetic". It barely stays on topic for a page.

I consider every point I raised has a direct bearing on the question the thread post asked. If that is so, the entire post stayed on topic.

If you consider this is not so please address which point(s) you consider were sufficiently off topic to cause you to say that.

I am interested in reasoned, differentiated, logic based discussions on what is admittedly a slightly controversial subject matter, but a perfectly valid (and interesting) one when presented in a point by point manner as I did.

THAT kind of discussion is what stretches the mind and makes Forums fun and interesting.

At least it is so for me.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Folks, psoomah has been more than accommodating and tolerant of the thread-crapping going on here.

This is a technical forum, please conduct yourselves accordingly.

Moderator Idontcare

 

psoomah

Senior member
May 13, 2010
416
0
0
Agreed. High-end discrete GPUs aren't going to vanish anytime soon, though low-end ones look to die off as Sandy Bridge and Fusion take off. Not economical to make everybody buy a huge GPU on-die, not when many people don't game at all. Not to mention pro-grade discrete cards.

AMD is developing OpenCL and OpenGL with Fusion specifically in mind; this is stated on their website. It makes sense, as they consider Fusion/APUs THE future of the company. Intel is working along similar lines. AMD's stated intention is to release Fusion chips on a yearly basis, with the graphics portion derived from the previous year's graphics architecture. For our purposes, this starts with the first Llano chip in 2011.

So in 2013, high end Fusion chips will contain 2012 based GPU architecture.

2010 - 6xxx
2011 - 7xxx
2012 - 8xxx
2013 - 9xxx

Llano will be released with 5xxx graphics due to process delays, but they can leapfrog to 7xxx graphics on their 2nd generation Fusion chips coming in 2012, and incorporate 8xxx graphics on their 3rd generation Fusion chips.

They will also increase the 'fusing' of the cpu and gpu with each new APU generation.

It's logical to extrapolate third generation APUs will be highly fused on a maturing 22nm process with 8xxx class graphics.

By this time OpenCL and OpenGL will be far advanced from where they are now, with software increasingly and seamlessly using the entire capability of the APU. Once this ecosystem is sufficiently advanced, increasing a computer's processing capacity will switch to adding plug-in APU boards; discrete GPUs will stop being developed, and existing boards will service legacy computers and eventually die out.

Intel will progress in a similar manner, and they don't have an existing discrete graphics market to consider: they will add graphics capability to their APUs just as fast as their engineers and designers can make it happen.

Between now and THAT however are several intermediate steps, each of which will increasingly freeze Nvidia out of the gpu market.

If a rule of thumb is that Fusion APUs incorporate 1/4 of the previous generation's top GPU, then 2013, at a minimum, = 1/4 of the 7870 GPU, which, coming on a new process, can be expected to double the 6870 GPU, which in turn appears to be landing in the 5970 ballpark. It could also be 1/4 of the 8870 GPU, which would mean ~5970 graphics.

1/4 of 7870 = 1/2 of 6870 = 1/1 of 5870.
1/4 of 8870 = 1/2 of 7870 = 1/1 of 6870 = 5970.

Could be either one, depending, but 5870 at a minimum.

With architectural improvement on a mature 22nm process, 2H 2013 high-end APUs should see at least 5870-class graphics performance. That can handily run nearly all current pure high-end computer games at 1080P on highest settings, and ALL console-derived games at highest settings. Next-gen console graphics are not going to exceed current state-of-the-art computer graphics, so that 2013 APU should be able to comfortably run any next-gen console games that are released, which means it will comfortably run all but a few of the games developers will be putting out, because the next-gen console will continue the trend of defining computer game graphics.
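The halving chain above can be sketched as a quick back-of-the-envelope calculation. This is only a sketch of the extrapolation logic, resting on the post's own assumptions (each GPU generation roughly doubles its predecessor; an APU carries ~1/4 of the previous generation's flagship); the function name is illustrative, not anything real.

```python
# Sketch of the halving-per-generation rule of thumb.
# Assumptions (from the post, not hard data):
#   - each new flagship GPU roughly doubles the previous generation's
#   - a Fusion APU carries about 1/4 of the previous generation's flagship

def apu_graphics_equivalent(apu_fraction, generations_back):
    """Fraction of an older flagship's performance the APU matches.

    apu_fraction: share of the previous-gen flagship built into the APU
                  (e.g. 0.25 for the 1/4 rule of thumb)
    generations_back: how many generations older the comparison flagship
                      is, counted from the GPU the APU is derived from
    """
    # Stepping one generation back halves the flagship baseline, so the
    # APU's relative share doubles with each step back.
    return apu_fraction * (2 ** generations_back)

# 1/4 of a "7870" = 1/2 of a "6870" = 1/1 of a "5870"
print(apu_graphics_equivalent(0.25, 1))  # vs. one generation back: 0.5
print(apu_graphics_equivalent(0.25, 2))  # vs. two generations back: 1.0
```

Under these assumptions, an APU with 1/4 of a two-generations-newer flagship matches the older flagship outright, which is all the "5870-class at a minimum" claim amounts to.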

2013 is not far away. Nvidia is currently in the early stages of designing 2013's generation of graphics chip, and well into designing their 2012 (Maxwell) chip. But if there is no consumer market for that chip, why would they compromise their chips by designing in features for the consumer market instead of designing purely for the professional and HPC markets?

My reasoning brings me to the conclusion that Nvidia is between a rock and a hard place with no crowbar in sight.
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
AMD is developing OpenCL and OpenGL with Fusion specifically in mind. This is stated on their website. Makes sense as they consider Fusion/APUs THE future of the company. Intel is working along similar lines. AMD's stated intention is to release Fusion chips on a yearly basis with the graphics portion derived from the previous years graphics architecture. This of course starts with the first Llano chip in 2011.

So in 2013, Fusion will contain 2012 based GPU architecture.

2010 - 6xxx
2011 - 7xxx
2012 - 8xxx
2013 - 9xxx

Llano will be released with 5xxx graphics due to process delays, but they can leapfrog to 7xxx graphics on their 2nd generation Fusion chips coming in 2012, and incorporate 8xxx graphics on their 3rd generation Fusion chips.

They will also increase the 'fusing' of the cpu and gpu with each new APU generation.

It's logical to extrapolate third generation APUs will be highly fused on a maturing 22nm process with 8xxx class graphics.

By this time OpenCL and OpenGL will be far advanced from where they are now and software increasingly and seamlessly using the entire capability of the APU. Once this ecosystem is sufficiently advanced, increasing processing capacity of a computer will switch to adding plug in APU boards to the computer and discrete gpu's will stop being developed and existing boards will service legacy computers and eventually die out.

Intel will progress in a similar manner and THEY don't have an existing discrete graphics market to nurse, they will add graphics capability to their APUs just as fast as their engineers and designers can make it happen.

Between now and THAT however are several intermediate steps, each of which will increasingly freeze Nvidia out of the gpu market.

If a rule of thumb is Fusion APUs incorporate 1/4 of the previous generations top gpu, so 2013 = 1/4 of the 78xx gpu, which, coming on a new process can be expected to double the 68xx gpu which it appears is going to be in the 5970 ballpark.

1/4 of 7870 = 1/2 of 6870 = 1/whole of 5870. With architectural improvement on a mature 22nm process, 2H 2013 high end APUs should see 5870 class graphics performance. That can handily run current pure high end computer games at 1080P highest settings. Next gen console graphics are not going to exceed current state of the art computer graphics, so that 2013 APU should be able to comfortably run any next gen console games that are released, which means it will comfortably run all but a few of the games developers will be putting out, because the next gen console will continue the trend and define the graphics of all but a few cutting edge computer games.

2013 is not far away. Nvidia is currently in the early stages of designing 2013's generation of graphics chip and well into designing their 2012 (maxwell) chip. But if there is no consumer market for that chip, why would they be compromising their chips by designing in features for the consumer market instead of designing purely for the professional and HPC markets?

Nvidia is between a rock and a hard place, and no crowbar in sight.

I actually don't disagree with the general thrust of Fusion, just with the timeline. You specifically singled out Kepler and Maxwell, which I think is a bit premature for reasons I already stated. Maybe if we were talking 5, 10, or 15 years out...
 

psoomah

Senior member
May 13, 2010
416
0
0
I think it will take a while for low-end videocards to die off. Just because Sandy Bridge and Fusion are available doesn't mean that everyone will replace their computers with it immediately.

I estimate about 3 years before you see a significant drop off in low end videocards. Maybe 5 years or more before the market vanishes/becomes tiny

I would bet Xmas 2012 sees 80% penetration of AMD or Intel APUs into the OEM laptop and desktop markets.
 

psoomah

Senior member
May 13, 2010
416
0
0
I know and I am disagreeing; I think complete fusion may happen but not in a year or even 3. High-end GPUs are huge compared to CPUs.

I did not posit complete fusion would happen in a year or 3.

It would be helpful if you singled out a specific point I made and responded with specific rebuttals to specific statements.

I did not understand the context of 'High-end GPUs are huge compared to CPUs'. Please clarify.
 

psoomah

Senior member
May 13, 2010
416
0
0
I actually don't disagree with the general thrust of Fusion, just with the timeline. You specifically singled out Kepler and Maxwell, which I think is a bit premature for reasons I already stated. Maybe if we were talking 5, 10, or 15 years out...

I laid out a timeline to 2H 2013. What specific parts of my reasoning concerning that timeline do you disagree with?

Kepler and Maxwell are specifically the GPU architectures mentioned by JHH that my reasoning led me to conclude would not be viable in the AIB graphics market. If you think it is premature, please address specific points I made with specific rebuttals to those points. Specious generalizations are not conducive to developing reasoned and differentiated discourse.
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
This of course starts with the first Llano chip in 2011.

FYI, Ontario and Zacate are the first APUs from AMD, with 80 SPs. Llano will be their second APU part in 2Q(?) 2011.

Llano is rumored to have 480 SPs, which is higher than the 5670's 400 SPs, but Llano's performance should be lower than a 5670 for obvious reasons.

Now, the 5670 is a lower mid-range card (1/4th of the high-end card, as you rightly pointed out). Fast forward two years: an AMD APU with a 5870-class GPU will still be a lower mid-range part in 2012.

Keep in mind, a 5670 is ~15% faster than a 3870, a high end GPU in 2007.
 

psoomah

Senior member
May 13, 2010
416
0
0
FYI, Ontario and Zacate are the first APUs from AMD, with 80 SPs. Llano will be their second APU part in 2Q(?) 2011.

Llano is rumored to have 480 SPs, which is higher than the 5670's 400 SPs, but Llano's performance should be lower than a 5670 for obvious reasons.

Now, the 5670 is a lower mid-range card (1/4th of the high-end card, as you rightly pointed out). Fast forward two years: an AMD APU with a 5870-class GPU will still be a lower mid-range part in 2012.

Keep in mind, a 5670 is ~15% faster than a 3870, a high end GPU in 2007.

My bad on the Llano assumption. I was thinking of high-end Fusion chip progression as it will change the discrete GPU market into the future.

A lower mid-range card in an Eyefinity/~2560 monitor environment, but not in a 1080P single-display environment.

Almost all computer games will be developed first for consoles, with Eyefinity support added for the computer version. But even if you have an Eyefinity setup, you can just add a graphics card of your choice to your quad Bulldozer-core Fusion-chipped computer. If you have three 20" monitors, your Fusion GPU already has you 2/3 of the way home. Just add a low-cost AIB and you're there.

AMD stated Eyefinity was developed specifically to absorb what would otherwise become superfluous GPU processing capacity, which is already somewhat applicable with the 5870 at 1080P.
 

psoomah

Senior member
May 13, 2010
416
0
0
Litigation. ;)

The government seems content as long as there are at least two strong competitors in a market. AMD and Intel qualify on that count. Nvidia is superfluous as far as the government is concerned.