Furthermore, while 7950s and above have more than 2 GB of VRAM, MOST cards are bought at a lower price point (7870s and below). No developer is going to alienate the majority of their market.
In addition, the target for TVs is 1080p, so super-high-res textures are not necessary.
Of course, none of it matters, because if the PS4 and 720 are as amazing as they are being made out to be... we may all be console gamers.
Understand consoles never had x86 cores before, thus never needed to be optimized for such a hierarchy. In short, modern games will be written for consoles exactly the same way they are written for PC. The architecture is the same, the development platform is the same; the only real differences are the API used in place of DirectX and the custom OS used for the console. Console ports will be extremely easy, and should in fact lessen game ports altogether. I would expect more games to release for every platform all at once, since there will be no major code edits needed to "port" the game to other platforms. If a game is written to utilize eight cores, such as those of the PlayStation 4 and most likely the XBOX 720, you can expect it to utilize up to the same amount of resources on PC. Desktop eight-core processors have existed on the PC market for a little over a year now. That means you'll start seeing more and more games written like the core of Battlefield 3 was written: to utilize however many cores the OS recognizes. That's why you see virtually no performance gain using an i7-3770K versus an FX-8350 in Battlefield 3, with the latter costing half the price. AMD's "moar cores" will pay off soon enough. As long as AMD doesn't add more cores to their existing FX line, I doubt you will see more cores added to Intel processors in the future. Both manufacturers are still chasing IPC, working out the efficiency and performance of their cores before adding new ones (AMD had to resort to more cores to balance out the i7 market).

The reason more than 4 cores is rarely used in games is not consoles, but that what needs to be done does not lend itself to using more than 4 cores. It is very difficult to take something that is linear and make it work on multiple cores.
Understand consoles never had x86 cores before, thus never needed to be optimized for such hierarchy.
Snip:::
Keep in mind that will change soon, as Steamroller introduces 15-30% more ops per clock, while Haswell is barely a bump in performance over Ivy Bridge. I predict AMD doing very well in the 2014 gaming market.
Battlefield 3 isn't really all that GPU bound; Frostbite is more CPU-heavy than anything. The benchmarks posted above were done on two 7970s in CrossFire, hence it's a CPU bottleneck. As for games with threading, I think I already established that very few current-generation games support more than four cores. Though future games like GTA V (if ever ported), Crysis 3, and Battlefield 4 should all support up to eight cores. AMD has a lot of room to establish themselves in the gaming market. If they can get their core performance up at least 15% from Piledriver, they may be real competition to Intel in the near future. Intel will either have to build much faster cores (not going to gap them much from AMD) or add more cores. I doubt you will see six-core Intel i5s, as they would cost what an i7 costs now. From any angle, there is a lot of room for AMD to become the #1 best-value gaming processor manufacturer on the market (like they have been in the past), offering performance equivalent to Intel's best i7 series at literally half the cost. That would be their selling point, and for once they would be right back in the game. Lastly, as an indie game developer, it's really not all that hard to optimize a game for multiple cores. Sure, threading is hard because you have to share resources via pipes, memory, and other methods. Though really, game developers are lazy and try writing games in as few threads as possible, and you end up with games like SimCity 5 as the outcome, one of the worst games of the year.

The reason that CPU chart is what it is has nothing to do with cores, and everything to do with the game not being CPU bottlenecked. That chart clearly shows a GPU-bound game.
I agree that you may see "more" games that thread well past 4 threads, but it will still remain a small number of games.
The biggest issue remains: games are linear by nature, and are very difficult to thread beyond what is done today.
Because if I wanted to play terrible console ports, I would buy a PS4/Xbox whatever. But I don't, so I don't.
I guess you're not familiar with functions such as SetThreadAffinityMask. Optimizing a game for multi-core processors is fairly straightforward; getting the threads to sync with each other is the actual hard part. Also, Bulldozer was produced in late 2011, which means eight-core processors have existed for just over a year now: long enough for game developers to start taking notice and utilizing the available resources of these number-crunching monsters. In fact, a lot of game performance doesn't come from the game code; it comes from the engine itself being optimized to work across multiple cores. If the engine knows how to efficiently allocate its workload across the available resources, the game shouldn't have a problem utilizing two cores, four cores, or even eight cores. For example, how well Battlefield 3 runs on eight cores, keeping up with Intel's best offering, is likely due not to EA's game developers but to how the Frostbite 2 engine was tweaked in the CPU department. And as for the XBOX, I did not know that; consoles are honestly at the very bottom of my interest list (consoles are terrible imo).

Reads as an optimistic fairy tale. It's not as easy as adding a 'use 8' flag to make a game use more cores. That might not even be the usage model; maybe a core is always going to be running OS background or network work?
Also Bulldozer launched 11/2011 and was developed for years.
Finally, the first Xbox used an Intel CPU.
Battlefield 3 isn't really all that GPU bound, Frostbite is more CPU heavy than anything.
Snip:::

Just because a benchmark uses top-end GPUs does not automatically make it CPU bound, especially at 5760x1080.
Here is another screenshot from the same benchmark on much lower settings. A measly 1-4 FPS difference is not what I'd call a GPU bottleneck.
I think you're misinterpreting the benchmark entirely. Battlefield 3 is a CPU-bound game; we know that because of Frostbite. The frame rate cutoff is not from a bottleneck, it's due to reaching the full potential of the GPUs. But understand this: within that same benchmark are results showing the FX-8350 was no slower than an i7-3770K clock for clock. It was able to maintain the same frame rate across the board, which means you wouldn't get any better performance by buying Intel (the whole point of my previous post). If Battlefield 3 weren't optimized to use eight cores, the FX-8350's scores would most likely have dropped off the chart. Or am I looking at a different chart entirely?

Just because a benchmark uses top-end GPUs does not automatically make it CPU bound, especially at 5760x1080.
BF3 can be CPU bound, but only in multiplayer or with lesser CPUs. With an overclock making no difference at all in performance, it is clearly not CPU bound in this benchmark. There is zero improvement when the i7 or the 8350 is overclocked. If it were CPU bound, it would show improvement when you OC the CPU. It doesn't, so it isn't.
I think you're misinterpreting the benchmark entirely. Battlefield 3 is a CPU bound game, we know that because of Frostbite.
Snip:::
Look, you don't have to get all mad over being wrong. It would be a CPU bottleneck if that were what was holding back the benchmarks. In fact it's neither; reaching full capacity on a piece of hardware is not called "bottlenecking". I might have said it was a CPU bottleneck, but that was only to put your GPU-bottleneck theory to rest (which is the more wrong of the two). A bottleneck means one piece of hardware is holding back the potential of another. I made that very clear in my previous posts. All games can be bottlenecked, but there is more to it than just hardware. The graph shows stock clocks, and both CPUs set to 4.4GHz. The gains from the boost are only marginal, meaning the GPUs can't do any more with what either CPU has to offer.

You don't know crap obviously.
If the GPUs reach their full potential, as you said, that means the CPU is not bottlenecking the GPUs. A bottleneck means the GPUs can't reach their full potential. That is the meaning of a bottleneck. If the GPUs reach full potential, the CPU by definition is not a bottleneck.
BF3 can be bottlenecked on some maps or in different situations, but it was clearly NOT bottlenecked in that benchmark. If it had been, performance would have increased with a CPU overclock. Both CPUs were tested at stock and at 4.4GHz, and there was absolutely no improvement, which clearly means there is no CPU bottleneck.
If the GPU's reach their full potential, as you said, that means the CPU is not bottlenecking the GPU's.
All this means is AMD's CPU market will have a much better edge in future gaming. The PlayStation 4 has eight Jaguar cores, which means every single game will be optimized for eight cores. When it comes time to port a game over to PC, AMD's FX series is going to start looking really good. A GPU is a GPU; there is no threading and such when it comes to game development. You tell it what to draw, and it does as it's told. How well it performs depends on the hardware itself and the drivers communicating with it. While optimizing games for multiple CPU cores is optional on PC, on a console there is no reason for developers not to utilize all of the available resources. So really the question isn't "why would you buy next gen Nvidia?" but rather "why would you buy next gen Intel?". All of the "buy the i5, it's better for gaming, bro" will be a thing of the past. Four fast cores cannot beat eight slightly slower cores when the application is made to utilize up to eight cores. The next gen XBOX is going to have a similar APU as well, most likely with the same core count for cross-console compatibility among game developers (it's easier to expect the same game performance without rewriting the core of the game). AMD deserves at least to be crowned king of gaming when it comes to their CPUs (like old times). I think most people will start to see that as software advances, Intel's IPC claims become more moot. The existing AMD FX series already crushes the Intel i7 series in thread-heavy workloads (the future has always been thread-dependent). I don't really want to make this a whole Intel vs AMD debate, but what I'm saying is that the only way any of these hardware manufacturers is going to benefit from what consoles have adopted is in how games are going to be heavily threaded and optimized for eight cores.
You do know it's standard for us developers to utilize every resource available on a console, for the fact that every other unit runs the same exact hardware. I don't see Intel offering up 3960Xs for $300 any time soon either.

You really think Intel won't respond by making processors with more cores? Seriously?
Let's be real: just like with the Xbox 360 and PS3, developers will not be using anywhere near the full potential of the new PS4 or 720. They won't start developing anything using all 8 threads for a while. By the time they do, Intel will have 6-8 core CPUs that are more than affordable.
It's all speculation though. Maybe AMD might hold a small advantage for a couple of months, but I HIGHLY doubt Intel won't have an offering ready by the time programmers start utilizing more threads. Hell, they already have very affordable 6-core processors. Don't get me wrong, I'm no Intel fanboy, I buy the best product out at the time, but I really don't see Intel dropping the ball again like they did with the Pentium 4. That is a NEVER AGAIN for them.
My OPINION: by the time the PS4 and Xbox 720 start utilizing 8 threads, Intel will be well beyond that point.
On a side note: aren't we due for a Conroe-type performance jump in the CPU or GPU industry? Seriously, help us out, AMD/Intel/Nvidia...
Seeing as I upgrade my GPU annually if required: Nvidia. Better drivers, and this gen Kepler runs cooler and uses less power. As for the eternal AMD vs Intel / AMD vs Nvidia / AMD vs the world debates: forget about average FPS, what about frame latencies? Doesn't the Intel and Nvidia combo currently offer not only better but *smoother* FPS comparatively?
With the Xbox reveal just a few days away, it has been confirmed that AMD will be powering all three next gen consoles. Microsoft and Sony have stated in the past that these next gen consoles are being developed for the next 10 years of gaming. That means for 10 years, developers making exclusive or cross platform titles will be coding for AMD hardware first, everyone else second.
With that in mind, why would a PC Gamer want an Nvidia GeForce card over an AMD Radeon for the next generation of video card releases?
Nvidia claimed they weren't interested in a PS4 design win due to slim margins, but isn't the opportunity cost greater than "slim margins" with AMD now having a monopoly?
What are your thoughts?
Disclaimer - I'm just a gamer. I'm not a programmer or developer and am clearly ignorant of how a game goes from conception to market. I'm only inquiring.
You do know it's standard for us developers to utilize every resource available on a console, for the fact that every other unit runs the same exact hardware. I don't see Intel offering up 3960Xs for $300 any time soon either.
