Tom's Hardware Guide: Financial Analysts Say Intel Killed the Discrete Graphics Card


Stochastic

Member
Apr 1, 2012
I just see this as an extension of the SoC-ification of the PC. Yes, IGPs will inevitably obviate the need for low-end GPUs. Arguably, this has already happened with Llano and IVB.

However, as others have said, this won't necessarily mean that the core dedicated GPU market will wither away immediately. Discrete GPUs may become more of a niche, but so long as that niche remains profitable, it won't go away. I do think that, eventually, discrete GPUs will go the way of sound cards, but that is still probably some time off.

What I think is interesting is a lot of people's reactions to reports like these. Enthusiasts such as ourselves, who once comprised a large portion of PC users, have become more and more of a minority as the PC has become ubiquitous and stationary productivity systems have given way to mobile consumption ones. Many of the posts in this thread are, I think, very revealing about enthusiasts' attitudes towards these trends. People clearly feel that their hobby is threatened by the "lowest common denominator" consumers who have usurped their position as the primary target of new hardware.

With IVB, Intel has made it very clear that its priority is not pushing the performance barrier but rather focusing on the mobile space and consolidating PC components into one chip (further evidence of this comes from Intel's new NUC form factor: http://www.anandtech.com/show/5800/slimming-desktops-down-intel-reveals-next-unit-of-computing). Who can blame them, honestly? They're just reacting to market trends as any sensible corporation would. AMD is also responding to the new computing world, albeit in its own way: http://www.anandtech.com/show/5503/understanding-amds-roadmap-new-direction.

The rhetoric I'm hearing reminds me of discussions you find on car forums about the push for fuel efficiency and environmental friendliness (listen to Jeremy Clarkson in this Forza 4 trailer: http://www.youtube.com/watch?v=4YyT3SQez2o&ob=av3e). You hear similar things in PC gaming communities, with people lamenting the state of the console-dominated games industry, the "dumbification" of games, etc. The thing is, all of these communities still exist, and as long as they do there will probably be products catering to them.
 

ShintaiDK

Lifer
Apr 22, 2012
So in 10 years an integrated GPU will be capable of playing the latest games at a minimum of 1080p and 60 FPS on at least high settings?

LMAO, I doubt they will be able to run BF3 on those settings at that performance level. Considering resolutions are only going to go up and pretty much everything will be 120Hz+ by then, I find it even less likely.

10 years ago the GeForce 4 series had just been released.

IGPs are already faster than the consoles, which are based on G7900/X1900 variants.

GPU makers are getting very desperate: 3D, 120Hz, Eyefinity/Surround, useless AA modes.
 

railven

Diamond Member
Mar 25, 2010
By the time Intel's HDn000 offers the same experience as the lower-tier NV/AMD chips available today, NV's and AMD's solutions will make Intel look like crap yet again. Here I assume PC gaming won't be dead before that, of course.

I think my fear isn't that PC gaming will be dead, but that high-end gaming will be, if not dead, in bad shape.

If the APUs get to a point where they can run console code (and look at the rumors of next-gen console hardware), then the devs (who have already shown they'll cave and cater to the lowest common denominator) will just go the shovelware route. Hell, porting might be easier due to so many similarities in the hardware.

I tell you what, having to wait almost a year for the DX11 Crysis 2 patch was an eye-opener. We, the high-end gaming elitists, don't matter much anymore. We sit here in forums and argue with each other about how awesome our team is - yet in the end Angry Birds outsold our best-looking game by a margin of a trillion to one (exaggerated to make my point). Next year Angry Birds 3D will run on an iPad 4, and so will Crysis 4.
 

ShintaiDK

Lifer
Apr 22, 2012
However, as others have said, this won't necessarily mean that the core dedicated GPU market will wither away immediately. Discrete GPUs may become more of a niche, but so long as that niche remains profitable, it won't go away. I do think that, eventually, discrete GPUs will go the way of sound cards, but that is still probably some time off.

Exactly :)
 

Jaydip

Diamond Member
Mar 29, 2010
I think my fear isn't that PC gaming will be dead, but that high-end gaming will be, if not dead, in bad shape.

If the APUs get to a point where they can run console code (and look at the rumors of next-gen console hardware), then the devs (who have already shown they'll cave and cater to the lowest common denominator) will just go the shovelware route. Hell, porting might be easier due to so many similarities in the hardware.

I tell you what, having to wait almost a year for the DX11 Crysis 2 patch was an eye-opener. We, the high-end gaming elitists, don't matter much anymore. We sit here in forums and argue with each other about how awesome our team is - yet in the end Angry Birds outsold our best-looking game by a margin of a trillion to one (exaggerated to make my point). Next year Angry Birds 3D will run on an iPad 4, and so will Crysis 4.
++:thumbsup:
 

blastingcap

Diamond Member
Sep 16, 2010
I think my fear isn't that PC gaming will be dead, but that high-end gaming will be, if not dead, in bad shape.

If the APUs get to a point where they can run console code (and look at the rumors of next-gen console hardware), then the devs (who have already shown they'll cave and cater to the lowest common denominator) will just go the shovelware route. Hell, porting might be easier due to so many similarities in the hardware.

I tell you what, having to wait almost a year for the DX11 Crysis 2 patch was an eye-opener. We, the high-end gaming elitists, don't matter much anymore. We sit here in forums and argue with each other about how awesome our team is - yet in the end Angry Birds outsold our best-looking game by a margin of a trillion to one (exaggerated to make my point). Next year Angry Birds 3D will run on an iPad 4, and so will Crysis 4.

Maybe I'm just optimistic, but I think stronger non-discrete GPUs are a GOOD thing. If Haswell's top part is as fast in practice as in theory, it'll be about as fast as a GTS 250 or HD 4850. Fast forward another couple of years, and even mainstream Intel CPUs will probably have that level of performance.

Why is this a good thing?

1. It raises the floor. If everyone has access to DX11 then maybe more gamedevs will use stuff like tessellation, rather than continue to plod along on DX9.

2. It dramatically increases the potential market size for graphically intense PC games, if everybody is forced to buy a decent GPU as part of a CPU. The percentage of computers able to run a modern game acceptably well is probably low right now, but imagine a situation where that number increases by a factor of 10. All of a sudden, the market size for graphically intense PC games increases by up to a factor of 10. For gamedevs, that could lead to more sales, more revenue, more profit, and thus more money to reinvest into making great games.

Yes it could also lead to stagnation if gamedevs figure they'll code for a certain level of graphics, namely, whatever consoles and APUs are capable of. But how is that any worse than the situation today? It can only get better, not worse, if strong APUs are commonplace.

3. Embedded GPUs may be repurposed, e.g., to drive hardware-accelerated physics.
 

blastingcap

Diamond Member
Sep 16, 2010
Has the HD4000 caught up with 8800GTX yet?

No, but the theoretical performance of Haswell, if Intel can solve the memory bandwidth problem (among others), is something like a GTS 250 or HD 4850--faster than an 8800GTX.

Ivy Bridge is already ~75% as fast as an HD 5570, and Haswell's top part should have roughly triple the performance of Ivy Bridge in theory, due to having 40 EUs vs. 16, probably clocked faster thanks to a more mature process, and with each of the 40 slightly more efficient, according to Intel. 225% of an HD 5570's performance takes you into the GTS 250 league or faster.
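To make that back-of-the-envelope math explicit, here is a minimal sketch in Python. The ~75% baseline, the 40 vs. 16 EU counts, and the roughly 3x multiplier are the rumored figures from this thread, not confirmed Intel specs:

# Back-of-the-envelope iGPU scaling estimate, using the rumored numbers
# quoted in this thread (not confirmed Intel specs).

ivb_vs_hd5570 = 0.75              # HD 4000 (Ivy Bridge) ~75% of a Radeon HD 5570
ivb_eus, haswell_eus = 16, 40
eu_ratio = haswell_eus / ivb_eus  # 2.5x from the EU count alone
per_eu_gain = 3.0 / eu_ratio      # rest of the rumored ~3x: clocks + per-EU efficiency

haswell_vs_ivb = eu_ratio * per_eu_gain             # ~3.0x Ivy Bridge
haswell_vs_hd5570 = ivb_vs_hd5570 * haswell_vs_ivb  # ~2.25x

print(f"Haswell iGPU vs. HD 5570: ~{haswell_vs_hd5570:.0%}")
# ~225% of an HD 5570 is roughly GTS 250 / HD 4850 territory, assuming
# memory bandwidth doesn't become the bottleneck first.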
 

Stochastic

Member
Apr 1, 2012
^^I agree with blastingcap. PC gaming only stands to benefit from more people being able to play PC games affordably.
 

Olikan

Platinum Member
Sep 23, 2011
Maybe I'm just optimistic, but I think stronger non-discrete GPUs are a GOOD thing. If Haswell's top part is as fast in practice as in theory, it'll be about as fast as a GTS 250 or HD 4850. Fast forward another couple of years, and even mainstream Intel CPUs will probably have that level of performance.

It won't be a bad thing; games will still be played, and gamers shouldn't be concerned...
APUs might trigger some impressive performance as the CPU and GPU grow closer and closer.

The problem is the discrete GPU market: it will become a niche market for those who want silly effects... yes, like the glory days of 2D cards.
 

railven

Diamond Member
Mar 25, 2010
Maybe I'm just optimistic, but I think stronger non-discrete GPUs are a GOOD thing. If Haswell's top part is as fast in practice as in theory, it'll be about as fast as a GTS 250 or HD 4850. Fast forward another couple of years, and even mainstream Intel CPUs will probably have that level of performance.

Why is this a good thing?

1. It raises the floor. If everyone has access to DX11 then maybe more gamedevs will use stuff like tessellation, rather than continue to plod along on DX9.

DX9 is nearing its 10th birthday. Who is to say that history won't repeat itself with DX11? By the time iGPUs have the power to run DX11 titles decently, shouldn't we be at DX15 or higher?

2. It dramatically increases the potential market size for graphically intense PC games, if everybody is forced to buy a decent GPU as part of a CPU. The percentage of computers able to run a modern game acceptably well is probably low right now, but imagine a situation where that number increases by a factor of 10. All of a sudden, the market size for graphically intense PC games increases by up to a factor of 10. For gamedevs, that could lead to more sales, more revenue, more profit, and thus more money to reinvest into making great games.

Yes it could also lead to stagnation if gamedevs figure they'll code for a certain level of graphics, namely, whatever consoles and APUs are capable of. But how is that any worse than the situation today? It can only get better, not worse, if strong APUs are commonplace.

I believe the same was said about DX9 going into consoles around 2005/2006, when it was still a relatively new API. I remember looking at BioShock on the 360 and admiring how good it looked for a console game. Well, by 2007/2008 that awe had washed away, and now in 2012 I'm pretty much hating consoles, haha. It's unfortunate that they are dictating what we as PC gamers get.

The other alternative is not one I want to support, where the hardware vendors put hardware-specific tweaks into games to promote their product. I understand it's business, but I'd rather not base my hardware purchase decisions on which vendor bribed/supported/whatever which dev studio.

I don't think I can have that positive an outlook when, over the course of the last 4 or 5 years, I've seen some great devs shut their doors, and then there's this recent interview:

http://www.gamepolitics.com/2012/05...ers-describe-lucasarts-management-psychopaths

Suits are now in control of the games we play, and it seems they care more about cost cutting and saving money than about graphics and technology.

3. Embedded GPUs may be repurposed, e.g., to drive hardware-accelerated physics.

That is something I'd support, but the question is, will it happen? If it does, count me in. Now we just need a dev with brass balls and a silver tongue who can OK it with HQ to venture into unknown terrain that targets only a fraction of a fraction of the market.


I'm excited for COD: Black Ops 2 only because I really enjoy their SP campaigns, but man, is that engine dated.
 

blastingcap

Diamond Member
Sep 16, 2010
PC games are quite affordable already.

Are you purposely misinterpreting all of this?

If the masses are roped into buying GPUs whether they want to or not, that should spread out the costs and make PC gaming hardware more affordable.

But to your point:

If the size of the PC gaming market increases, then perhaps gamedevs will pay more attention to it. And if even a few people who would otherwise NOT buy PC games start buying PC games because they got a "free" GPU, then that means more revenue and profit for PC game developers, which one would hope would lead to a bigger budget for the next game, or a lower price, or whatever, if competition pressures them into doing so.
 

exar333

Diamond Member
Feb 7, 2004
The i3-2100 is a dual-core CPU with HT and it's an excellent gaming CPU. Your excuse to negatively qualify an unreleased chip is getting old. They could surprise us.

On the topic, I think that both Nvidia and AMD would suffer greatly if Intel released a still better graphics solution. In other words, the HD 4000, while very good by today's standards, is not good enough. Older tech (Llano) is still better.

How is it better? Much higher power consumption for a decent bump in GPU performance? The difference between the HD 4000 and Llano in 'if you can play a game at all' is really small. If you can play it on the HD 4000, it will play a bit better on Llano. If you can play it on Llano, you can play it on the HD 4000 as well. With SB/IB you get much better CPU performance and power consumption.

I am not knocking Llano, just saying you pay for the better GPU with more power usage, and CPU performance is lacking. IB gave us 50% more GPU performance than SB with less power usage.

PD is giving us a marginally better GPU with less CPU performance for likely the same power usage. That's not very encouraging.

If you look at gaming charts, 2-module BD cores are the absolute worst gaming CPUs around. They are arguably the only CPUs to really avoid for gaming, unless you plan to OC like crazy. They are not comparable to the i3-2100, which is MUCH faster and uses a lot less power.

To sum it up, unless we see massive power consumption decreases with PD, it's a mixed bag. Definitely nothing to get excited about if you own Llano today, and much less attractive than the soon-to-be-available Celeron and i3 IB CPUs, IMHO. They will be good if they are cheaper than existing Llano options, and nothing else.
 

railven

Diamond Member
Mar 25, 2010
Are you purposely misinterpreting all of this?

If the masses are roped into buying GPUs whether they want to or not, that should spread out the costs and make PC gaming hardware more affordable.

But to your point:

If the size of the PC gaming market increases, then perhaps gamedevs will pay more attention to it. And if even a few people who would otherwise NOT buy PC games start buying PC games because they got a "free" GPU, then that means more revenue and profit for PC game developers, which one would hope would lead to a bigger budget for the next game, or a lower price, or whatever, if competition pressures them into doing so.

My only counter to this is the Nintendo Wii. It introduced gaming (console gaming) to a far wider audience. We can argue about the control scheme being the X-factor, but in the end the Wii fell from the sky just as fast as it rose to the heavens.

For the real gamers (not the Angry Birds crowd), we got nothing out of it, just copy-and-paste games.

Where there is a bigger audience, there is more of a reason to cut corners and fake interest. Selling a crappy game to 10,000,000 people will probably get you more sales than selling an awesome game to 1,000 people.

And I'm not saying we'll be getting worse games, just that the high-end portion will be very lacking.

EDIT: Since I brought up COD above: the quality of the games (to me as a gamer; opinions will differ, of course) improved over the years. We went from no-name voice actors to A-list voice actors. The scripts/plots got a little better (sure, make the Michael Bay jokes), but the graphics, engine, etc. are relatively the same.

I don't think bigger budgets == better-looking games; I've just seen them translate to more lavish marketing and more celebrity voice actors.
 

Jaydip

Diamond Member
Mar 29, 2010
Are you purposely misinterpreting all of this?

If the masses are roped into buying GPUs whether they want to or not, that should spread out the costs and make PC gaming hardware more affordable.

But to your point:

If the size of the PC gaming market increases, then perhaps gamedevs will pay more attention to it. And if even a few people who would otherwise NOT buy PC games start buying PC games because they got a "free" GPU, then that means more revenue and profit for PC game developers, which one would hope would lead to a bigger budget for the next game, or a lower price, or whatever, if competition pressures them into doing so.
I didn't misinterpret anything. Now you don't have to pay $60 to get a AAA experience; you can enjoy quality games at a much lower price point. Have you tried Bastion or Orcs Must Die!? They are very good games and very affordable as well. Looking forward to Torchlight 2 at $19.99.
 

blastingcap

Diamond Member
Sep 16, 2010
Are you even reading my post?

1. I already talked about Haswell's theoretical performance. If all goes well, it should be about triple the speed of Ivy Bridge's top part, i.e., about a GTS 250 or HD 4850. Many PC gamers still game on parts that old--this forum is not representative of PC gamers in general, so look at Steam's Hardware Survey for a more realistic assessment. Give it another couple of years and it's very possible that a mainstream Intel CPU will have ~HD6770 level performance, putting it on par with consoles if rumors are to be believed about their GPUs. (They don't have to go through Direct3D though so they will be faster than you might think.)

2. You seem to have completely missed my 2nd point. Dramatically increasing the number of people with access to PC hardware that can run games can only be good for PC gaming, imho. For reasons I stated in my post and for further reasons I explained to Jaydip up there. And here's another reason: it may encourage people who get a taste of PC gaming to then invest in a discrete GPU for Hybrid Crossfire or just by itself. So raising the floor may indirectly raise what is considered midrange, too.

3. Once again, a strong embedded GPU/APU/whatever you want to call it can only help. It won't hurt. You won't even have to pay much extra for it, since all the people NOT interested in gaming will chip in. It's actually sort of a subsidy to gamers who wouldn't buy discrete cards. And if embedded GPUs can be used for things like hardware physics acceleration, then non-gamers will basically be subsidizing that for gamers, to a large degree.
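As a rough illustration of that cost-spreading argument, here is a tiny sketch; every number in it is hypothetical, invented purely to show the shape of the subsidy logic:

# Hypothetical cost-spreading sketch of the "non-gamers chip in" argument.
# Every figure below is invented purely for illustration.

igpu_cost = 10.0       # hypothetical extra silicon cost per CPU, paid by ALL buyers
discrete_card = 50.0   # hypothetical price of an entry-level discrete card

# A would-be entry-level gamer gets comparable graphics for the iGPU's
# share of the CPU price instead of a separate card purchase:
saving = discrete_card - igpu_cost
print(f"Entry-level graphics: ${igpu_cost:.0f} bundled vs. "
      f"${discrete_card:.0f} standalone -> ${saving:.0f} saved per gamer")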

Edited to respond to the post you made while I was writing the above:

What you say is a worst-case scenario. In fact one can argue it's already happening--that it's the status quo. I think strong embedded GPUs can only help, not hurt, PC gaming. If it ends up doing absolutely nothing, then fine, at least it didn't hurt. And I do think it will help.


 

blackened23

Diamond Member
Jul 26, 2011
Not sure what the argument is about. Discrete graphics aren't dead and won't be for a while, but not all people care about gaming. People and system builders that were forced to include a discrete card won't any longer, because the HD 4000 performs well enough for the 90% of the population who don't spend $500 to improve gaming performance.

This should be obvious. Yes, iGPUs will cannibalize discrete sales for a LOT of people - discrete sales are significantly lower than they were in years past. The hardcore gamers will still buy discrete GPUs, so it won't die. Not hard to figure out.
 

railven

Diamond Member
Mar 25, 2010
I read your points clearly and responded to them clearly, and here we're both about to repeat ourselves...

Are you even reading my post?

1. I already talked about Haswell's theoretical performance. If all goes well, it should be about triple the speed of Ivy Bridge's top part, i.e., about a GTS 250 or HD 4850. Many PC gamers still game on parts that old--this forum is not representative of PC gamers in general, so look at Steam's Hardware Survey for a more realistic assessment. Give it another couple of years and it's very possible that a mainstream Intel CPU will have ~HD6770 level performance, putting it on par with consoles if rumors are to be believed about their GPUs. (They don't have to go through Direct3D though so they will be faster than you might think.)

What does this do for us, the high end? That was my initial concern. This just expands the bottom and gives devs more of a reason to focus on the bottom versus the high end. I never question PC gaming's health; I question the high end's health.

2. You seem to have completely missed my 2nd point. Dramatically increasing the number of people with access to PC hardware that can run games can only be good for PC gaming, imho. For reasons I stated in my post and for further reasons I explained to Jaydip up there. And here's another reason: it may encourage people who get a taste of PC gaming to then invest in a discrete GPU for Hybrid Crossfire or just by itself. So raising the floor may indirectly raise what is considered midrange, too.

The issue with this reasoning is that neither Intel nor AMD sponsors game development. IF the bottom hardware is good enough (which is the basis of my argument), then there is no need to cater to the top portion. I think hybrid Crossfire will be even more of a niche until the scaling is fixed - we've seen benches of a single card performing better than the iGPU+dGPU doing their song and dance.

3. Once again, a strong embedded GPU/APU/whatever you want to call it can only help. It won't hurt. You won't even have to pay much extra for it, since all the people NOT interested in gaming will chip in. It's actually sort of a subsidy to gamers who wouldn't buy discrete cards. And if embedded GPUs can be used for things like hardware physics acceleration, then non-gamers will basically be subsidizing that for gamers, to a large degree.

Again, neither AMD nor Intel sponsors game development (well, AMD to an extent). What are they subsidizing? My next processor purchase? Intel and AMD flooding the market with "good enough" does what for game sales? This isn't Microsoft/Nintendo/Sony we're talking about, who sell their hardware at a loss to recoup in software sales and then fund their own gaming IPs.

PC gaming is already far cheaper than console gaming, yet console games still dominate the sales charts. History may, and possibly will, repeat itself.

Edited to respond to the post you made while I was writing the above:

What you say is a worst-case scenario. In fact one can argue it's already happening--that it's the status quo. I think strong embedded GPUs can only help, not hurt, PC gaming. If it ends up doing absolutely nothing, then fine, at least it didn't hurt. And I do think it will help.

Again, it seems you missed my initial concern. I'm not claiming PC gaming itself is dead; I'm worried about the high end. So in this circle of repetition you didn't even acknowledge my initial point.
 

Gikaseixas

Platinum Member
Jul 1, 2004
How is it better? Much higher power consumption for a decent bump in GPU performance?
Power consumption is not as high as you're trying to make us believe. In the end it is indeed a reasonable improvement over the Intel HD 4000, around the 50% mark.
I am not knocking Llano, just saying you pay for the better GPU with more power usage, and CPU performance is lacking.
CPU performance is definitely lower, but it's decent IMO. My Llano PC does everything you throw at it and then some. The GPU is pretty good, even for some medium 1600x900 gaming.
PD is giving us a marginally better GPU with less CPU performance for likely the same power usage. That's not very encouraging.
How do you know all that? Less CPU performance with the same power? Let's wait for the actual reviews, please.
Definitely nothing to get excited about if you own Llano today, and much less attractive than the soon-to-be-available Celeron and i3 IB CPUs, IMHO.
Tell me how that works if even the mighty IB can't beat Llano graphics? Celeron, really?
 

blastingcap

Diamond Member
Sep 16, 2010
I do acknowledge your myopic point. You are not looking at the big picture and keep wringing your hands over the high end without acknowledging how raising the floor can help PC gaming in general, including high-end GPU owners.

Look at the state of affairs today: desktops are a shrinking market and among those desktops few are capable of playing modern games at reasonable framerates if at all.

Fast forward several years, and your mainstream desktop CPU could well contain something on par with an HD 6770.

If you do not think that will help PC gaming, or if you think that it would encourage gamedevs to code for the low end more than they do now, I disagree.

Raising the floor can only HELP, not HURT, in most cases. It may raise what is considered midrange as well if people shell out for hybrid xfire or something or we get hardware accelerated physics for "free." And there are people who would never have gotten into PC gaming, but they might play a few games since they have "free" hardware that allows for it, and then they might upgrade to a discrete card, further raising the number of high-end-GPU-bearing desktops.

You are concluding that raising the floor will only encourage gamedevs to code for the lowest common denominator--but they are already doing that anyway! Dramatically raising the floor of desktop GPU power can only help, unless you have some sort of armageddon scenario where APUs raise the costs of discrete cards by tons, or put NV out of business so there is no competition and AMD gouges us, etc.

Example: Blizz codes for the lowest common denominator. If the floor gets raised by enough, then they may spend more time on higher-end effects or at least DX11.

Another example: gamedev figures that with the explosion of game-worthy APUs, it can safely beta test and do QA for a narrower range of hardware, figuring that the absolute worst case is HD6770-level performance. It can spend more resources on optimizing for the high end rather than the lowest of the low end. It also expects many more sales of PC games because the number of desktop PCs capable of gaming has exploded, and some people will start buying PC games again or for the first time, whereas they wouldn't have before due to the cost of a gaming-grade discrete card, or for fear of opening up their desktop to install a video card (which believe it or not, still scares a lot of people, particularly if they have to also upgrade their PSU). So maybe gamedev spends more time polishing up the PC version of the game now that PC gaming is a resurgent market. And the extra profit may eventually feed into more resources for future game dev.

Then there are the other arguments I've already made about how raising the floor may indirectly help raise the ceiling as well, through increasing the size of the market and all that entails. (It's also about the overall size of the market in absolute terms. A simple example: scenario 1: 10 desktops, 5 of which have no worthwhile GPU, 2 have low-end gaming GPUs, 2 midrange, and 1 high end. Scenario 2: 10 desktops, 5 of which have APUs that count as low-end gaming GPUs, 2 have low-end GPUs that can do hybrid xfire, 2 midrange, 1 high end. The average gaming GPU spec just went down, but the total number of gaming-worthy PCs went up.)
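Here is a quick tally of that example, using the same hypothetical desktop counts as the scenarios above:

# Tally of the two hypothetical 10-desktop scenarios described above.

scenario_1 = {"no worthwhile GPU": 5, "low-end gaming GPU": 2,
              "midrange GPU": 2, "high-end GPU": 1}
scenario_2 = {"APU (low-end gaming class)": 5,
              "low-end GPU (hybrid xfire capable)": 2,
              "midrange GPU": 2, "high-end GPU": 1}

def gaming_worthy(desktops):
    # Everything except the "no worthwhile GPU" bucket can run games.
    return sum(n for tier, n in desktops.items() if "no worthwhile" not in tier)

for name, s in (("scenario 1", scenario_1), ("scenario 2", scenario_2)):
    print(f"{name}: {gaming_worthy(s)} of {sum(s.values())} desktops are gaming-worthy")
# scenario 1: 5 of 10; scenario 2: 10 of 10 -- the average GPU tier drops,
# but the number of gaming-worthy PCs doubles.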

Keep in mind PC gaming does not exist in a vacuum. There are many ways people can spend their entertainment dollars. If APUs/embedded GPUs can become the "gateway drug" that leads more people into hybrid xfire or high-end GPUs (and thus nudge gamedevs to code for better GPUs), good. In fact, even if it generates a single additional PC game sale, that is better than nothing and better than the status quo. If getting a "free" decent GPU with each desktop purchase means more money spent on PC games instead of console games or books or movies or whatever, that helps PC gaming and eventually high-end GPU owners as well. I already elaborated about some possible ways that may happen; there are other ways such sales can help the ecosystem, though.

If you disagree with the above, then let's just agree to disagree.



antihelten

Golden Member
Feb 2, 2012
I am not knocking Llano, just saying you pay for the better GPU with more power usage, and CPU performance is lacking. IB gave us 50% more GPU performance than SB with less power usage.

Actually, according to AnandTech's review, Llano uses less power when gaming (20% to be exact), and that test was made with Ivy Bridge set to balanced performance and Llano to max performance, so if both were set to max performance (as was done during the gaming benchmarks) the gap would probably be even wider.

Of course, Llano uses significantly more power during browsing (54%); however, it actually uses less power during H.264 playback (13% less), and at idle (arguably the most common scenario) Llano only uses about 17% more.

So it's not really that cut and dried.
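Putting those cited AnandTech deltas side by side (Llano's power draw relative to Ivy Bridge; negative means Llano draws less):

# Llano power draw relative to Ivy Bridge, per the AnandTech figures cited
# above (negative = Llano draws less power in that workload).

llano_vs_ivb = {"gaming": -0.20, "H.264 playback": -0.13,
                "idle": +0.17, "browsing": +0.54}

for workload, delta in sorted(llano_vs_ivb.items(), key=lambda kv: kv[1]):
    print(f"{workload:>15}: {delta:+.0%}")
# Which workload dominates your day decides which chip "wins" on power.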

PD is giving us a marginally better GPU with less CPU performance for likely the same power usage. That's not very encouraging.

Actually, AMD is claiming twice the performance/watt for Trinity compared to Llano:

http://www.pcworld.com/article/249267/amd_targets_ultrabooks_shows_18mm_trinity_notebook.html

Of course, AMD isn't really the most reliable source, and it's also unclear which SKUs they are comparing (apparently a 17W TDP Trinity to a 35W Llano).
 

railven

Diamond Member
Mar 25, 2010
I do acknowledge your myopic point. You are not looking at the big picture and keep wringing your hands over the high end without acknowledging how raising the floor can help PC gaming in general, including high-end GPU owners.

Look at the state of affairs today: desktops are a shrinking market and among those desktops few are capable of playing modern games at reasonable framerates if at all.

Fast forward several years, and your mainstream desktop CPU could well contain something on par with an HD 6770.

If you do not think that will help PC gaming, or if you think that it would encourage gamedevs to code for the low end more than they do now, I disagree.

This is where I can't follow you: a few years from now, do you think the HD 6770 will be capable of playing games that are modern at the time? If the answer is yes, then stagnation has already occurred. iGPUs have been getting stronger over the years, and this time they gained a giant boost - however, the software didn't.

Discrete hardware has improved, and yet the top end is barely tapped into. It's always been this way. You're trying to argue that because in a few years the bottom will be an HD 6770, it will somehow be midrange? If that is the future, then the high end is boned!

Raising the floor can only HELP, not HURT, in most cases. It may raise what is considered midrange as well if people shell out for hybrid xfire or something or we get hardware accelerated physics for "free." And there are people who would never have gotten into PC gaming, but they might play a few games since they have "free" hardware that allows for it, and then they might upgrade to a discrete card, further raising the number of high-end-GPU-bearing desktops.

You are concluding that raising the floor will only encourage gamedevs to code for the lowest common denominator--but they are already doing that anyway! Dramatically raising the floor of desktop GPU power can only help, unless you have some sort of armageddon scenario where APUs raise the costs of discrete cards by tons, or put NV out of business so there is no competition and AMD gouges us, etc.

If you follow the industry, devs aren't moving out of their safety zone, due to strict publisher requirements. Studios that sold 2 million+ copies of their game are being shut down. This isn't how it was 10 or so years ago. They all want to be the next Halo or the next Call of Duty. Metro 2033 barely secured its sequel, since it was one of the few games THQ made a profit from.

Publishers are blaming decreased software sales on piracy; somehow expanding the user base is going to turn that around? If they continue their antics they'll shout it louder, alienating us further.

The bridge between console and PC happened as console software started to reflect PC software (Microsoft, you bastards), and as long as that one company is in charge of it I'll continue to feel they are encouraging stagnation (Microsoft hasn't been very supportive of its PC side - you know, Games for Windows Live). Increasing the user base could influence this, but will it cause them to deviate from their Xbox hardware? I'd wager it won't.

Example: Blizz codes for the lowest common denominator. If the floor gets raised by enough, then they may spend more time on higher-end effects or at least DX11.

Another example: gamedev figures that with the explosion of game-worthy APUs, it can safely beta test and do QA for a narrower range of hardware, figuring that the absolute worst case is HD6770-level performance. It can spend more resources on optimizing for the high end rather than the lowest of the low end. It also expects many more sales of PC games because the number of desktop PCs capable of gaming has exploded, and some people will start buying PC games again or for the first time, whereas they wouldn't have before due to the cost of a gaming-grade discrete card, or for fear of opening up their desktop to install a video card (which believe it or not, still scares a lot of people, particularly if they have to also upgrade their PSU). So maybe gamedev spends more time polishing up the PC version of the game now that PC gaming is a resurgent market. And the extra profit may eventually feed into more resources for future game dev.

Again, have you been following the gaming industry? I've been noticing (and hearing of) a bad dev cycle:

Publisher contracts studio; studio hires employees; the art team does preliminary work and is then either fired or moved to another project; coders work on the core product and, once finished, are either fired or moved to another project; and a skeleton of the team remains to continue post-launch support (or they themselves get fired if the product doesn't even make a dent).

And these products are failing mostly on consoles, with poor PC sales and the usual piracy cited as the cause. But if we increase the user base, magically publishers are going to pass the savings on to us (the gamers) by selling more copies and thus end this cycle of cash grabs? The more products they put out, the better it is for them; only a select few IPs survive even a few years with quality products, or they just get whored out via the lowest-common-denominator system (which should sell plenty with a wider audience and will just create more of the same).

Then there are the other arguments I've already made about how raising the floor may indirectly help raise the ceiling as well, through increasing the size of the market and all that entails. (It's also about the overall size of the market in absolute terms. A simple example: scenario 1: 10 desktops, 5 of which have no worthwhile GPU, 2 have low-end gaming GPUs, 2 midrange, and 1 high end. Scenario 2: 10 desktops, 5 of which have APUs that count as low-end gaming GPUs, 2 have low-end GPUs that can do hybrid xfire, 2 midrange, 1 high end. The average gaming GPU spec just went down, but the total number of gaming-worthy PCs went up.)

Sorry, but I personally don't see hybrids becoming more than a niche. Even now it takes an enthusiast to run a hybrid system. Casual users... I just don't see it. The ones that invest the time and research into it will probably find better solutions (i.e., one powerful discrete card). That's just my opinion.

AMD's lineup has been out for a while, and I've yet to run across a single hybrid solution. How about you?

Keep in mind PC gaming does not exist in a vacuum. There are many ways people can spend their entertainment dollars. If APUs/embedded GPUs can become the "gateway drug" that leads more people into hybrid xfire or high-end GPUs (and thus nudge gamedevs to code for better GPUs), good. In fact, even if it generates a single additional PC game sale, that is better than nothing and better than the status quo. If getting a "free" decent GPU with each desktop purchase means more money spent on PC games instead of console games or books or movies or whatever, that helps PC gaming and eventually high-end GPU owners as well. I already elaborated about some possible ways that may happen; there are other ways such sales can help the ecosystem, though.

If you disagree with the above, then let's just agree to disagree.

I used a gateway-drug example - the Wii. It broke sales records, and it also made the two giants copy it. Which console had the lowest attach rate? The Nintendo Wii didn't do much for gaming as a whole. You can argue its hardware was already limited, but even in terms of simple innovation the console was riddled with shovelware games and copy-and-paste money grabs.

I already said at the start that I'm probably not as positive as you. Frankly, reading the industry news, I'm not very optimistic about what's around the corner from our remaining devs. Have you seen CoD: Black Ops 2?
 

blastingcap

Diamond Member
Sep 16, 2010
Thank you for your thoughtful comments. I think you are seeing causation where I see correlation. I think we have different forecasts and don't believe it is productive to discuss this further.
 

Stochastic

Member
Apr 1, 2012
Thank you for your thoughtful comments. I think you are seeing causation where I see correlation. I think we have different forecasts and don't believe it is productive to discuss this further.

An internet debate ending amicably? Very impressive *applauds*. I've read your and railven's posts and I have to say you guys have made some very insightful comments. It's always nice to see thoughtful discussions on forums.