So in 10 years an integrated GPU will be capable of playing the latest games at a minimum of 1080p and 60 FPS on at least high settings?
LMAO, I doubt they will be able to run BF3 at those settings at that performance level. Considering resolutions are only going to go up, and pretty much everything will be 120 Hz+ by then, I find it even less likely.
By the time an Intel HDn000 offers the same experience as the lower-tier NV and AMD chips available today, NV's and AMD's solutions will make Intel look like crap yet again. Here I assume PC gaming won't be dead before then, of course.
However, as others have said, this won't necessarily mean that the dedicated GPU market will wither away immediately. Discrete GPUs may become more of a niche, but so long as that niche remains profitable, it won't go away. I do think that, eventually, discrete GPUs will go the way of sound cards, but that is still probably some time off.
++ :thumbsup: I think my fear isn't that PC gaming would be dead, but that high-end gaming would be, if not dead, in bad shape.
If the APUs get to a point where they can run console code (and look at the rumors of next-gen console hardware), then the devs (who have already shown they'll cave and cater to the lowest common denominator) will just go the shovelware route. Hell, porting might be easier due to so many similarities in the hardware.
I tell you what, having to wait almost a year for the DX11 Crysis 2 patch was an eye-opener. We, the high-end gaming elitists, don't matter much anymore. We sit here in forums and argue with each other about how awesome our team is, yet in the end Angry Birds outsold our best-looking game by a margin of a trillion to 1 (exaggerated to make my point). Next year Angry Birds 3D will run on an iPad 4, and so will Crysis 4.
Has the HD 4000 caught up with the 8800 GTX yet?
^^ I agree with blastingcap. PC gaming only stands to benefit from more people being able to play PC games affordably.
Maybe I'm just optimistic, but I think stronger non-discrete GPUs are a GOOD thing. If Haswell's top part is as fast in practice as in theory, it'll be about as fast as a GTS 250 or HD4850 or so. Fast forward another couple of years, and even the mainstream Intel CPUs will probably have that level of performance.
Why is this a good thing?
1. It raises the floor. If everyone has access to DX11 then maybe more gamedevs will use stuff like tessellation, rather than continue to plod along on DX9.
2. It dramatically increases the potential market size for graphically intense PC games, if everybody is forced to buy a decent GPU as part of a CPU. The percentage of computers able to run a modern game acceptably well is probably low right now, but imagine a situation where that number increases by a factor of 10. All of a sudden, the market size for graphically intense PC games increases by up to a factor of 10. For gamedevs, that could lead to more sales, more revenue, more profit, and thus more money to reinvest into making great games.
Yes it could also lead to stagnation if gamedevs figure they'll code for a certain level of graphics, namely, whatever consoles and APUs are capable of. But how is that any worse than the situation today? It can only get better, not worse, if strong APUs are commonplace.
3. Embedded GPUs may be repurposed, e.g., to drive hardware-accelerated physics.
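To make point 3 less hand-wavy, here is a toy sketch (my own illustration, not code from any real engine) of running a simple physics step on whatever GPU the system exposes through OpenCL, which would include an embedded GPU. The kernel, particle count, and constants are all made up for the example.
[code]
# Toy physics step on whatever OpenCL device is present (an iGPU counts).
# Illustrative sketch only; nothing here comes from a shipping engine.
import numpy as np
import pyopencl as cl

N = 100000
dt = np.float32(1.0 / 60.0)               # one frame at 60 FPS
pos = np.random.rand(N).astype(np.float32) * 100.0  # heights in meters
vel = np.zeros(N, dtype=np.float32)

ctx = cl.create_some_context()            # picks the iGPU if that's all you have
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags

src = """
__kernel void euler_step(__global float *pos, __global float *vel, float dt) {
    int i = get_global_id(0);
    vel[i] -= 9.81f * dt;                 /* gravity */
    pos[i] += vel[i] * dt;
    if (pos[i] < 0.0f) {                  /* bounce off the floor */
        pos[i] = -pos[i];
        vel[i] = -0.5f * vel[i];
    }
}
"""

pos_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=pos)
vel_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=vel)
prog = cl.Program(ctx, src).build()

prog.euler_step(queue, (N,), None, pos_buf, vel_buf, dt)
cl.enqueue_copy(queue, pos, pos_buf)      # read results back to the host
print(pos[:5])
[/code]
The point isn't this particular kernel; it's that an embedded GPU shows up as an ordinary compute device, so work like this doesn't have to touch the discrete card at all.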
PC games are quite affordable already.
The i3-2100 is a dual-core CPU with HT, and it's an excellent gaming CPU. Your habit of prejudging an unreleased chip is getting old. They could surprise us.
On topic, I think that both Nvidia and AMD would suffer greatly if Intel released a still better graphics solution. In other words, the HD 4000, while very good by today's standards, is not good enough. Older tech (Llano) is still better.
Are you purposely misinterpreting all of this?
If the masses are roped into buying GPUs whether they want to or not, that should spread out the costs and make PC gaming hardware more affordable.
But to your point:
If the size of the PC gaming market increases, then perhaps gamedevs will pay more attention to it. And if even a few people who would otherwise NOT buy PC games, start buying PC games because they got a "free" GPU, then that means more revenue and profit for PC game developers, which one would hope would lead to a bigger budget for the next game, or lower its price, or whatever, if competition pressures them into doing so.
I didn't misinterpret anything. Now you don't have to pay $60 to get a AAA experience; you can enjoy quality games at a much lower price point. Have you tried Bastion or Orcs Must Die!? They are very good games and very affordable as well. Looking forward to Torchlight 2 at $19.99.
DX9 is nearing its 10th birthday. Who is to say that history won't repeat itself with DX11? By the time iGPUs have the power to run DX11 titles decently, shouldn't we be at DX15 or higher?
I believe the same was said about DX9 going into consoles around 2005/2006 when it was still a relatively new API. I remember looking at Bioshock on 360 and admiring how good it looked for a console game. Well, by 2007/2008 that awe washed away and now in 2012 I'm pretty much hating consoles haha. It's unfortunate that they are dictating what we as PC gamers get.
The other alternative is not one I want to support, where the hardware vendors put hardware-specific tweaks into games to promote their product. I understand it's business, but I'd rather not base my hardware purchase decisions on which vendor bribed/supported/whatever which dev studio.
I don't think I share that positive outlook when, over the course of the last 4 or 5 years, I've seen some great devs shut their doors, and then there's this recent interview:
http://www.gamepolitics.com/2012/05...ers-describe-lucasarts-management-psychopaths
Suits are now in control of the games we play, and it seems they are more about cost cutting and money saving than they are graphics and technology.
That is something I'd support, but the question is will it happen? If it does, count me in. Now we just need the dev with brass balls and a silver tongue who can ok it with HQ to venture into unknown terrain that targets only a fraction of a fraction of the market.
I'm excited for COD: Black Ops 2 only because I really enjoy their SP campaigns, but man is that engine dated.
My only counter to this is the Nintendo Wii. It introduced gaming (console gaming) to a far wider audience. We can argue about the control scheme being the X-Factor, but in the end the Wii rose to the heavens just as fast as it fell from the sky.
For the real gamers (not the Angry Birds crowd), we got nothing out of it, just copy-and-paste games.
Where there is a bigger audience, there is more of a reason to cut corners and fake interest. Selling a crappy game to 10,000,000 people will probably get you more sales than selling an awesome game to 1,000 people.
And, I'm not saying we'll be getting worse games, just that the high end portion of it will be very lacking.
EDIT: Since I brought up COD above: the quality (to me as a gamer; opinions will differ, of course) of the games has improved over the years. We went from no-name voice actors to A-list voice actors. The scripts/plots got a little better (sure, make the Michael Bay jokes), but the graphics, engine, etc. are relatively the same.
I don't think bigger budgets == better looking games, I've just seen them translate to more lavish marketing and more celebrity voice actors.
Are you even reading my post?
1. I already talked about Haswell's theoretical performance. If all goes well, it should be about triple the speed of Ivy Bridge's top part, i.e., about a GTS 250 or HD 4850. (I put some rough numbers on this at the end of the post.) Many PC gamers still game on parts that old--this forum is not representative of PC gamers in general, so look at Steam's Hardware Survey for a more realistic assessment. Give it another couple of years and it's very possible that a mainstream Intel CPU will have ~HD 6770 level performance, putting it on par with the next consoles if the rumors about their GPUs are to be believed. (Consoles don't have to go through Direct3D, though, so they will be faster than you might think.)
2. You seem to have completely missed my 2nd point. Dramatically increasing the number of people with access to PC hardware that can run games can only be good for PC gaming, imho. For reasons I stated in my post and for further reasons I explained to Jaydip up there. And here's another reason: it may encourage people who get a taste of PC gaming to then invest in a discrete GPU for Hybrid Crossfire or just by itself. So raising the floor may indirectly raise what is considered midrange, too.
3. Once again, a strong embedded GPU/APU/whatever you want to call it, can only help. It won't hurt. You won't even have to pay much extra for it since all the people NOT interested in gaming, will chip in. It's actually sort of a subsidy to gamers who wouldn't buy discrete cards. And if embedded GPUs can be used for things like hardware physics acceleration, then non-gamers will basically be subsidizing that for gamers, to a large degree.
Edited to respond to the post you made while I was writing the above:
What you say is a worst-case scenario. In fact one can argue it's already happening--that it's the status quo. I think strong embedded GPUs can only help, not hurt, PC gaming. If it ends up doing absolutely nothing, then fine, at least it didn't hurt. And I do think it will help.
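One more edit: here are the rough numbers promised in point 1, done as peak shader FLOPS only. Peak FLOPS ignores memory bandwidth, drivers, and everything else that matters, so treat it as a sanity check, not a benchmark; Haswell GT3's EU count and clock are rumors at this point, so those two inputs are assumptions on my part.
[code]
# Back-of-the-envelope peak shader GFLOPS. Known specs for shipping parts;
# Haswell GT3's EU count and clock are rumored values, i.e. assumptions.
def gflops(units, flops_per_clock, clock_ghz):
    return units * flops_per_clock * clock_ghz

parts = {
    "HD 4000 (16 EU x 16 FLOPs @ 1.15 GHz)":     gflops(16, 16, 1.15),
    "Haswell GT3 (40 EU x 16 FLOPs @ 1.2 GHz?)": gflops(40, 16, 1.20),
    "GTS 250 (128 SP x 2 FLOPs @ 1.836 GHz)":    gflops(128, 2, 1.836),
    "HD 4850 (800 SP x 2 FLOPs @ 0.625 GHz)":    gflops(800, 2, 0.625),
}
for name, g in parts.items():
    print(f"{name}: ~{g:.0f} GFLOPS")

# HD 4000:     ~294 GFLOPS
# Haswell GT3: ~768 GFLOPS  (roughly 2.6x HD 4000)
# GTS 250:     ~470 GFLOPS
# HD 4850:     ~1000 GFLOPS
[/code]
So "about triple Ivy Bridge, landing between a GTS 250 and an HD 4850" is at least arithmetically plausible, with the usual caveat that peak FLOPS is the most flattering possible metric.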
How is it better? Much higher power consumption for a decent bump in GPU performance?
Power consumption is not as high as you're trying to make us believe it is. In the end it is indeed a reasonable improvement over the Intel HD 4000, around the 50% mark.
I am not knocking Llano, just saying you pay for the better GPU with more power usage, and CPU performance is lacking. IB gave us 50% more GPU performance than SB with less power usage.
CPU performance is definitely lower, but it's decent IMO. My Llano PC does everything you throw at it and then some. The GPU is pretty good, even for some medium 1600x900 gaming.
PD is giving us a marginally better GPU with less CPU performance for likely the same power usage. That's not very encouraging.
How do you know all that? Less CPU performance at the same power? Let's wait for the actual reviews, please.
Definitely nothing to get excited about if you own Llano today, and much less attractive than the soon-to-be-available Celeron and i3 IB CPUs, IMHO.
Tell me how that works if even the mighty IB can't beat Llano graphics? A Celeron, really?
I read your points clearly and responded to them clearly, and here we're both about to repeat ourselves...
What does this do for us, the high end? Which was my initial concern. This just expands the bottom and gives devs more of a reason to focus on the bottom versus the high end. I never question PC gaming's health, I question the high end's health.
The issue with this reasoning is that neither Intel nor AMD sponsors game development. If the bottom hardware is good enough (which is the basis of my argument), then there is no need to cater to the top portion. I think Hybrid CrossFire will be even more of a niche until the scaling is fixed; we've seen benches of a single card performing better than the iGPU+dGPU doing their song and dance.
Again, neither AMD nor Intel sponsors game development (well, AMD to an extent). What are they subsidizing? My next processor purchase? Intel and AMD flooding the market with "good enough" does what for game sales? This isn't Microsoft/Nintendo/Sony we're talking about, who sell their hardware at a loss to recoup it in software sales, and who then fund their own gaming IPs.
PC gaming is already far cheaper than console gaming, yet console games still dominate the sales charts. History may, and possibly will, repeat itself.
Again, it seems you missed my initial concern. I'm not claiming PC gaming itself is dead. I'm worried about the high end. So in this circle of repetition you didn't even acknowledge my initial point.
I do acknowledge your myopic point. You are not looking at the big picture and keep wringing your hands over the high end without acknowledging how raising the floor can help PC gaming in general, including high-end GPU owners.
Look at the state of affairs today: desktops are a shrinking market and among those desktops few are capable of playing modern games at reasonable framerates if at all.
Fast forward several years, and your mainstream desktop CPU could well contain something on par with a HD 6770.
If you do not think that will help PC gaming, or if you think that it would encourage gamedevs to code for the low end more than they do now, I disagree.
Raising the floor can only HELP, not HURT, in most cases. It may raise what is considered midrange as well if people shell out for hybrid xfire or something or we get hardware accelerated physics for "free." And there are people who would never have gotten into PC gaming, but they might play a few games since they have "free" hardware that allows for it, and then they might upgrade to a discrete card, further raising the number of high-end-GPU-bearing desktops.
You are concluding that raising the floor will only encourage gamedevs to code for the lowest common denominator--but they are already doing that anyway! Dramatically raising the floor of desktop GPU power can only help, unless you have some sort of armageddon scenario where APUs raise the costs of discrete cards by tons, or puts NV out of business so there is no competition and AMD gouges us, etc.
Example: Blizz codes for the lowest common denominator. If the floor gets raised by enough, then they may spend more time on higher-end effects or at least DX11.
Another example: gamedev figures that with the explosion of game-worthy APUs, it can safely beta test and do QA for a narrower range of hardware, figuring that the absolute worst case is HD6770-level performance. It can spend more resources on optimizing for the high end rather than the lowest of the low end. It also expects many more sales of PC games because the number of desktop PCs capable of gaming has exploded, and some people will start buying PC games again or for the first time, whereas they wouldn't have before due to the cost of a gaming-grade discrete card, or for fear of opening up their desktop to install a video card (which believe it or not, still scares a lot of people, particularly if they have to also upgrade their PSU). So maybe gamedev spends more time polishing up the PC version of the game now that PC gaming is a resurgent market. And the extra profit may eventually feed into more resources for future game dev.
Then there are the other arguments I've already made about how raising the floor may indirectly help raise the ceiling as well, through increasing the size of the market and all that entails. (It's also about the overall size of the market in absolute terms. A simple example, counted out in the sketch below. Scenario 1: 10 desktops, 5 of which have no worthwhile GPU, 2 have low-end gaming GPUs, 2 midrange, and 1 high end. Scenario 2: 10 desktops, 5 of which have APUs that count as low-end gaming GPUs, 2 have low-end GPUs that can do hybrid xfire, 2 midrange, 1 high end. The average gaming GPU spec just went down, but the total number of gaming-worthy PCs went up.)
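If you want that counted out, here it is with made-up tier scores, just to have something to average:
[code]
# The 10-desktop example, counted out. Tier scores are arbitrary:
# 0 = no gaming GPU, 1 = low end, 2 = midrange, 3 = high end.
scenario_1 = [0]*5 + [1]*2 + [2]*2 + [3]*1   # today
scenario_2 = [1]*5 + [1]*2 + [2]*2 + [3]*1   # the 5 no-GPU boxes get APUs

for name, tiers in (("scenario 1", scenario_1), ("scenario 2", scenario_2)):
    gaming = [t for t in tiers if t > 0]
    avg = sum(gaming) / len(gaming)
    print(f"{name}: {len(gaming)}/10 gaming-worthy, avg gaming tier {avg:.2f}")

# scenario 1: 5/10 gaming-worthy, avg gaming tier 1.80
# scenario 2: 10/10 gaming-worthy, avg gaming tier 1.40
[/code]
Average tier down, gaming-worthy count doubled. That's the whole argument in a few lines of arithmetic.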
Keep in mind PC gaming does not exist in a vacuum. There are many ways people can spend their entertainment dollars. If APUs/embedded GPUs can become the "gateway drug" that leads more people into hybrid xfire or high-end GPUs (and thus nudge gamedevs to code for better GPUs), good. In fact, even if it generates a single additional PC game sale, that is better than nothing and better than the status quo. If getting a "free" decent GPU with each desktop purchase means more money spent on PC games instead of console games or books or movies or whatever, that helps PC gaming and eventually high-end GPU owners as well. I already elaborated about some possible ways that may happen; there are other ways such sales can help the ecosystem, though.
If you disagree with the above, then let's just agree to disagree.
Thank you for your thoughtful comments. I think you are seeing causation where I see correlation. I think we have different forecasts and don't believe it is productive to discuss this further.
