3DVagabond
Lifer
- Aug 10, 2009
IMO, if it helps AMD in any way it will be on their CPUs.
Because Intel is incapable of making eight-core consumer CPUs?
Beyond having an x86 CPU and a GPU based on GCN, the new Xbox is even rumored to be running a lite version of Windows 8.
I don't think future ports will be as terrible as you're implying with this next generation of consoles.
The consoles are not dealing with PC DX and drivers...
I'm actually kind of hoping that developers will start to abandon DX for OpenGL. Less reliance on Windows is something we should want now that it seems Phaëton is driving Microsoft...
Look, you don't have to get all mad over being wrong. It would be a CPU bottleneck if that were what was holding back the benchmarks. In fact it's neither; when a piece of hardware reaches full capacity, that's not called "bottlenecking". I might have said it was a CPU bottleneck, but that was only to put your GPU bottleneck theory to rest (which is the more wrong of the two). A bottleneck means one piece of hardware is holding back the potential of another piece of hardware. I made that very clear in my previous posts. All games can be bottlenecked, but there is more to it than just hardware. The graph shows stock clocks, and both CPUs set to 4.4GHz. The gains for the boost are only marginal, meaning the GPUs can't do any more with what either CPU has to offer.
Notice it notes there is NO CPU bottleneck. Also note that it mentions that BF3 has no issues in singleplayer in this regard. The only CPU bottlenecks with BF3 are in multiplayer, something your chart is not showing.

"Historically, we’ve seen AMD’s processors bottleneck the performance of certain games at low resolutions and mainstream quality settings. Using a GeForce GTX 680 at Battlefield 3’s Ultra quality preset, however, reveals no such limitation (even with anti-aliasing disabled completely).

Of course, this only applies to the single-player campaign, which tends to be GPU-heavy. The multi-player element of Battlefield 3 is more taxing on the processor. But because it’s difficult to create a repeatable benchmark involving 63 other players, we’ll move on to another game notorious for its emphasis on CPU speed."

http://www.tomshardware.com/reviews/fx-8350-core-i7-3770k-gaming-bottleneck,3407-5.html

"Looking back at my notes for the Bulldozer launch (AMD Bulldozer Review: FX-8150 Gets Tested), AMD was very enthusiastic about FX’s performance in Battlefield 3 (multiplayer beta, at the time). And no wonder—Battlefield 3's single-player campaign doesn’t care if you’re using a $130 Core i3 or $315 Core i7. It doesn’t care if you come armed with two Hyper-Threaded cores or four Bulldozer modules. It just. Doesn’t. Care.

In fact, after getting a little overzealous swapping out Lynnfield-, Clarkdale-, and Sandy Bridge-based chips, I tried one AMD CPU and decided to call it a day. Any reasonably modern processor is going to be held back by graphics long before hamstringing performance itself."
They all say the same thing over and over. In single player BF3, the games are GPU bound and it doesn't matter what CPU you use. Heck, one of those even shows that 2 cores can equal 6 cores as well.

"AMD’s lower-cost FX-8350 continues to maintain performance parity in Battlefield 3, even as our highest resolution and detail settings lean hard against a pair of Radeon HD 7970s."
"With that in mind, why would a PC Gamer want an Nvidia GeForce card over an AMD Radeon for the next generation of video card releases?"

Because nVidia provides solutions to my requirements just a little better and closer than AMD does.
"With that in mind, why would a PC Gamer want an Nvidia GeForce card over an AMD Radeon for the next generation of video card releases?"
According to this article (http://www.eurogamer.net/articles/digitalfoundry-inside-killzone-shadow-fall), Killzone for PS4 uses 3GB of its memory for its graphical components. My take on this is that if a first-gen PS4 game already uses that much memory at 1080p, anybody gaming at 1440p/1600p will need at least 6GB to be 'future proof'.
AMD will likely have a new architecture out by the time devs start really taking advantage of the PS4's hardware.
Nope, the 7970 GHz Edition is faster and smoother for the majority of games, while also being cheaper than the 680. Once AMD patched up the frame latencies, PCPer pretty much started to ignore the single-GPU comparisons, because they made the 680 look even worse. The power difference is also only around 20-30W under load.
The fallacy in the original post is that game developers will automatically focus most of their resources on dedicated game consoles powered by AMD graphics. The trend moving forward is that game developers will increasingly devote more and more resources towards mobile gaming on open platforms, and game developers will try to take advantage of these huge ecosystems by providing free-to-play games or low cost games that reach a huge audience rather than a restricted audience. And for those game developers who want to push forward bleeding edge graphics quality, PC gaming on open platforms will generally be the preferred choice. Within the next 1-3 years, PC graphics technology will be well ahead of what is possible in Playstation Next or Xbox Next.
I don't know about that; video card development has slowed to a snail's pace. The past two years have shown virtually no massive increase in graphics power. Moore's law for the desktop is dead. Mobile is where all the emphasis is.
Console GPU purchases have nothing to do with PC purchases. Even if AMD gets an influx of money for this, if anything it will hurt the PC portion of its GPU business, since they will be dedicated to pumping out console units.
I hate to say this, but consoles are terrible for the PC business; they pull too many resources away from tech advances on the PC front. All we have seen the last few years is GPUs getting slightly faster on slightly better hardware. Nothing that blows stuff out of the water by leaps and bounds like the early advances did.
