Why would you buy next gen Nvidia?


Xarick

Golden Member
May 17, 2006
1,199
1
76
Of course, none of it matters, because if the PS4 and 720 are as amazing as they are being made out to be... we may all be console gamers.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
Furthermore, while 7950s and above have more than 2 GB, MOST cards are bought at a lower price point, i.e. 7870s and below. No developer is going to alienate the majority of their market.
In addition, the target for TVs is 1080p, so super-high-res textures are not necessary.

Why would a developer using high-resolution textures alienate anyone? The beautiful thing about PC gaming is that you (usually) have a bevy of graphical options to tailor the game to your system. If you don't have enough video memory to use those fancy textures, then turn the texture quality setting down.

Some people may find these options daunting, because they're pretty much only used to volume and gamma sliders. :p All joking aside, they don't feel like learning about them, so that's most likely where NVIDIA's GeForce Experience came from.
 

Zanovar

Diamond Member
Jan 21, 2011
3,446
232
106
Of course, none of it matters, because if the PS4 and 720 are as amazing as they are being made out to be... we may all be console gamers.

Erm, nothing against consoles, dear chap, just not my thang. :)
 
Jan 31, 2013
108
0
0
All this means is AMD's CPUs will have a much better edge in future gaming. The PlayStation 4 has eight Jaguar cores, which means every single game will be optimized for eight cores. When it comes time to port a game over to PC, AMD's FX series is going to start looking really good.

A GPU is a GPU; there is no threading and such when it comes to game development. You tell it what to draw, and it does as it's told. How well it performs depends on the hardware itself and the drivers communicating with it. Though optimizing games for multiple CPU cores is optional on PC, on a console there is no reason for developers not to utilize all of the available resources.

So really the question isn't "why would you buy next gen Nvidia?" but rather "why would you buy next gen Intel?". All of the "buy the i5, it's better for gaming bro" will be a thing of the past. Four fast cores cannot beat eight slightly slower cores when the application is made to utilize up to eight cores. The next-gen Xbox is going to have a similar APU as well, most likely with the same core count for cross-console compatibility among game developers (easier to expect the same game performance without rewriting the core of the game).

AMD deserves at least to be crowned king of gaming when it comes to their CPUs (like old times). I think most people will start to see that as software advances, Intel's IPC advantage becomes more moot. The existing AMD FX series already crushes the Intel i7 series in thread-heavy workloads (the future has always been thread dependent). I don't really want to make this a whole Intel vs. AMD debate, but what I'm saying is that the only way any of these hardware manufacturers are going to benefit from what consoles have adopted is in how games are going to be heavily threaded and optimized for eight cores.
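(To put rough numbers on the "four fast cores vs. eight slower cores" claim above, here is a minimal Amdahl's-law sketch in C++. The 1.3x per-core speed edge and the parallel fractions are illustrative assumptions, not measurements of any real chip.)

```cpp
#include <cstdio>
#include <initializer_list>

// Amdahl's law: speedup on n cores when a fraction p of the frame's
// work can run in parallel and (1 - p) must stay serial.
static double amdahl(double p, int n) {
    return 1.0 / ((1.0 - p) + p / n);
}

int main() {
    const double fast_core = 1.3;  // assumed per-core speed edge of the 4-core chip
    for (double p : {0.5, 0.8, 0.95}) {
        double quad = fast_core * amdahl(p, 4);  // 4 fast cores
        double octo = amdahl(p, 8);              // 8 slower cores
        std::printf("p=%.2f  4 fast: %.2fx  8 slow: %.2fx\n", p, quad, octo);
    }
    // Under these assumptions the quad wins at p=0.5, it's roughly a tie
    // near p=0.8, and the eight-core part only pulls ahead around p=0.95.
}
```

How much of a frame actually parallelizes is exactly what the replies below dispute.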
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
The reason more than 4 cores are rarely used in games is not consoles, but that the work involved does not lend itself to using more than 4 cores. It is very difficult to take something that is linear and make it work on multiple cores.
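(A minimal C++ sketch of this point, with illustrative names and numbers: per-entity work splits across cores cleanly, but the frame loop itself is a serial dependency chain that no amount of cores can shorten.)

```cpp
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

struct Entity { float x = 0.0f, vx = 1.0f; };

// Per-entity physics has no cross-entity dependency, so it splits
// across cores cleanly.
void update_slice(std::vector<Entity>& es, std::size_t lo, std::size_t hi, float dt) {
    for (std::size_t i = lo; i < hi; ++i)
        es[i].x += es[i].vx * dt;
}

void simulate(std::vector<Entity>& es, int frames, float dt) {
    unsigned n = std::max(1u, std::thread::hardware_concurrency());
    for (int f = 0; f < frames; ++f) {      // the frame loop itself is serial:
        std::vector<std::thread> pool;      // frame f+1 needs frame f's state
        std::size_t chunk = es.size() / n;
        for (unsigned t = 0; t < n; ++t) {
            std::size_t lo = t * chunk;
            std::size_t hi = (t + 1 == n) ? es.size() : lo + chunk;
            pool.emplace_back(update_slice, std::ref(es), lo, hi, dt);
        }
        for (auto& th : pool) th.join();    // barrier before the next frame
    }
}

int main() {
    std::vector<Entity> world(100000);
    simulate(world, 60, 1.0f / 60.0f);      // 60 serial frames, parallel inside each
}
```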
 

lagokc

Senior member
Mar 27, 2013
808
1
41
Because if I wanted to play terrible console ports, I would buy a PS4/Xbox whatever. But I don't, so I don't.

90% of the games I play are PC-exclusive, because RTS, good FPS, and good RPG titles simply do not work on consoles.
 
Jan 31, 2013
108
0
0
The reason more than 4 cores are rarely used in games is not consoles, but that the work involved does not lend itself to using more than 4 cores. It is very difficult to take something that is linear and make it work on multiple cores.
Understand that consoles never had x86 cores before, so games never needed to be optimized for that architecture. In short, modern games will be written for consoles exactly the same way they are written for PC. The architecture is the same, the development platform is the same; the only things really different are the API used in place of DirectX and the custom OS used by the console.

Console ports will be extremely easy, and this should in fact reduce the need for porting altogether. I would expect more games to release for every platform at once, since there will be no major code edits needed to "port" the game to other platforms. If a game is written to utilize eight cores, such as those of the PlayStation 4 and most likely the XBOX 720, you can expect it to utilize up to the same amount of resources on PC. Desktop eight-core processors have existed on the PC market for a little over a year now. That means you'll start seeing more and more games written the way the core of Battlefield 3 was written: to utilize however many cores the OS recognizes. That's why you see virtually no performance gain using an i7-3770K vs. an FX-8350 in Battlefield 3, with the latter costing half the price.

AMD's "moar cores" will pay off soon enough. As long as AMD doesn't add more cores to their existing FX line, I doubt you will see more cores added to Intel processors in the future. Both manufacturers are still chasing IPC, working out the efficiency and performance of their cores before adding new ones (AMD had to resort to more cores to balance out the i7 market).

[Attached image: BF3UltraCPUBottleneck2013.png]


Keep in mind that will change soon, as Steamroller introduces 15-30% more ops per clock while Haswell is barely a bump over Ivy Bridge. I predict AMD doing very well in the 2014 gaming market.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
The reason that CPU chart looks the way it does has nothing to do with cores, and everything to do with the game not being CPU bottlenecked. That chart clearly shows a GPU-bound game. If CPU performance mattered, you'd see differences when the CPUs were OC'ed.

I agree that you may see "more" games that thread well past 4 threads, but it will still remain a small number of games.

The biggest issue remains: games are linear by nature, and are very difficult to thread beyond what is done today.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Understand that consoles never had x86 cores before, so games never needed to be optimized for that architecture.
Snip:::



Keep in mind that will change soon, as Steamroller introduces 15-30% more ops per clock while Haswell is barely a bump over Ivy Bridge. I predict AMD doing very well in the 2014 gaming market.

Reads like an optimistic fairy tale. It's not as easy as adding a 'use 8 cores' flag to make a game use more cores. That might not even be the usage model; maybe a core is always going to be running OS background or network work?
Also, Bulldozer launched in 11/2011 and was in development for years.

Finally, the first Xbox used an Intel CPU.
 
Jan 31, 2013
108
0
0
The reason that CPU chart looks the way it does has nothing to do with cores, and everything to do with the game not being CPU bottlenecked. That chart clearly shows a GPU-bound game.

I agree that you may see "more" games that thread well past 4 threads, but it will still remain a small number of games.

The biggest issue remains: games are linear by nature, and are very difficult to thread beyond what is done today.
Battlefield 3 isn't really all that GPU bound; Frostbite is more CPU heavy than anything. The benchmark posted above was done on two 7970s in CrossFire, hence it's a CPU bottleneck. As for games with threading, I think I already established that very few current-generation games support more than four cores. Though future games like GTA V (if ever ported), Crysis 3, and Battlefield 4 should all support up to eight cores.

AMD has a lot of room to establish themselves in the gaming market. If they can get their core performance up at least 15% from Piledriver, they may be real competition for Intel in the near future. Intel will either have to build much faster cores (not going to gap them much from AMD) or add more cores. I doubt you will see six-core Intel i5s, as they would cost what an i7 costs now. Viewed from any angle, there is a lot of room for AMD to become the #1 best-value gaming processor manufacturer on the market (like they have been in the past), offering performance equivalent to Intel's best i7 series at literally half the cost. That would be their selling point, and for once they would be right back in the game.

Lastly, as an indie game developer, it's really not all that hard to optimize a game for multiple cores. Sure, threading is hard because you have to share resources via pipes, memory, and other methods. But really, game developers are lazy and try to write games in as few threads as possible, and you end up with games like SimCity 5 as the outcome, one of the worst games of the year.

Here is another screen from the same benchmark at much lower settings. A measly 1-4 FPS difference is not what I'd call a GPU bottleneck.
[Attached image: BF3MediumCPUBottleneck2013.png]
 

Wildman107

Member
Apr 8, 2013
46
0
0
Because if I wanted to play terrible console ports, I would buy a PS4/Xbox whatever. But I don't, so I don't.

Beyond having an x86 CPU and a GPU based on GCN, the new Xbox is even rumored to be running a lite version of Windows 8.

I don't think future ports will be as terrible as you're implying with this next generation of consoles.
 
Jan 31, 2013
108
0
0
Reads like an optimistic fairy tale. It's not as easy as adding a 'use 8 cores' flag to make a game use more cores. That might not even be the usage model; maybe a core is always going to be running OS background or network work?
Also, Bulldozer launched in 11/2011 and was in development for years.

Finally, the first Xbox used an Intel CPU.
I guess you're not familiar with functions such as SetThreadAffinityMask. Optimizing a game for multi-core processors is fairly straightforward; getting the threads to sync with each other is the actual hard part. Also, Bulldozer was produced in late 2011, which means eight-core processors have existed for just over a year now. That's long enough for game developers to start taking notice and utilizing the available resources of these number-crunching monsters.

In fact, a lot of game performance doesn't come from the game code; it comes from the engine itself being optimized to work across multiple cores. If the engine knows how to efficiently allocate its workload across the available resources, the game shouldn't have a problem utilizing two cores, four cores, or even eight cores. For example, how well Battlefield 3 runs on eight cores, keeping up with Intel's best offering, is likely due not to EA's game developers but to how the Frostbite 2 engine was tweaked in the CPU department.

As for the Xbox, I did not know that; consoles are honestly at the very bottom of my interest list (consoles are terrible imo).
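(For reference, SetThreadAffinityMask is a real Win32 call; a minimal sketch of its use follows. Note that pinning a thread to a core says nothing about splitting work across threads, which is the synchronization problem the post concedes is the hard part.)

```cpp
#include <windows.h>
#include <cstdio>

int main() {
    // Pin the current thread to core 0 (bit 0 of the affinity mask).
    // The call returns the thread's previous mask, or 0 on failure.
    DWORD_PTR prev = SetThreadAffinityMask(GetCurrentThread(), 1);
    if (prev == 0) {
        std::printf("SetThreadAffinityMask failed: %lu\n", GetLastError());
        return 1;
    }
    // ... work that should stay on core 0 ...
    SetThreadAffinityMask(GetCurrentThread(), prev);  // restore the old mask
    return 0;
}
```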
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Battlefield 3 isn't really all that GPU bound; Frostbite is more CPU heavy than anything. The benchmark posted above was done on two 7970s in CrossFire, hence it's a CPU bottleneck.
Snip:::
Just because a benchmark uses top-end GPUs does not automatically make it CPU bound, especially at 5760x1080.

BF3 can be CPU bound, but only in multiplayer or with lesser CPUs. With an overclock making no difference at all in performance, it is clearly not CPU bound in this benchmark. There is zero improvement when the i7 or the 8350 is overclocked. If it were CPU bound, it would show improvements when you OC the CPU. It doesn't, so it isn't.
 
Jan 31, 2013
108
0
0
Just because a benchmark uses top-end GPUs does not automatically make it CPU bound, especially at 5760x1080.

BF3 can be CPU bound, but only in multiplayer or with lesser CPUs. With an overclock making no difference at all in performance, it is clearly not CPU bound in this benchmark. There is zero improvement when the i7 or the 8350 is overclocked. If it were CPU bound, it would show improvements when you OC the CPU. It doesn't, so it isn't.
I think you're misinterpreting the benchmark entirely. Battlefield 3 is a CPU-bound game; we know that because of Frostbite. The frame rate cutoff is not from a bottleneck; it's due to reaching the full potential of the GPUs. But understand this: within that same benchmark are results showing the FX-8350 was no slower than an i7-3770K clock for clock. It was able to maintain the same frame rate across the board, which means you wouldn't get any better performance by buying Intel (the whole point of my previous post). If Battlefield 3 weren't optimized to use eight cores, the FX-8350's scores would most likely have dropped off the chart. Or am I looking at a different chart entirely?
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I think you're misinterpreting the benchmark entirely. Battlefield 3 is a CPU-bound game; we know that because of Frostbite. The frame rate cutoff is not from a bottleneck; it's due to reaching the full potential of the GPUs. But understand this: within that same benchmark are results showing the FX-8350 was no slower than an i7-3770K clock for clock. It was able to maintain the same frame rate across the board, which means you wouldn't get any better performance by buying Intel (the whole point of my previous post). If Battlefield 3 weren't optimized to use eight cores, the FX-8350's scores would most likely have dropped off the chart. Or am I looking at a different chart entirely?

You don't know crap obviously.

If the GPUs reach their full potential, as you said, that means the CPU is not bottlenecking the GPUs. A bottleneck means the GPUs can't reach their full potential; that is the meaning of a bottleneck. If the GPUs reach full potential, the CPU by definition is not a bottleneck.

BF3 can be bottlenecked on some maps or in different situations, but it was clearly NOT bottlenecked in that benchmark. If it had been, the performance would increase with a CPU overclock. Both CPUs were tested at stock and at 4.4 GHz, and there was absolutely no improvement, which clearly means there is no CPU bottleneck.
 
Jan 31, 2013
108
0
0
You don't know crap obviously.

If the GPUs reach their full potential, as you said, that means the CPU is not bottlenecking the GPUs. A bottleneck means the GPUs can't reach their full potential; that is the meaning of a bottleneck. If the GPUs reach full potential, the CPU by definition is not a bottleneck.

BF3 can be bottlenecked on some maps or in different situations, but it was clearly NOT bottlenecked in that benchmark. If it had been, the performance would increase with a CPU overclock. Both CPUs were tested at stock and at 4.4 GHz, and there was absolutely no improvement, which clearly means there is no CPU bottleneck.
Look, you don't have to get all mad over being wrong. It would be a CPU bottleneck if that were what was holding back the benchmarks. In fact it's neither; reaching full capacity on a piece of hardware is not called "bottlenecking". I might have said it was a CPU bottleneck, but that was only to put your GPU-bottleneck theory to rest (which is the more wrong of the two). A bottleneck means one piece of hardware is holding back the potential of another. I made that very clear in my previous posts. All games can be bottlenecked, but there is more to it than just hardware. The graph shows stock clocks and both CPUs set to 4.4 GHz. The gains from the boost are only marginal, meaning the GPUs can't do any more with what either CPU has to offer.
 

Wildman107

Member
Apr 8, 2013
46
0
0
If the GPUs reach their full potential, as you said, that means the CPU is not bottlenecking the GPUs.

+1 for bystander36. That's totally true.

If the resolution were reduced along with the quality settings, those benchmarks might change to show a clear winner.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
Seeing as I upgrade my GPU annually if required: Nvidia. Better drivers, and this gen Kepler runs cooler and uses less power. As for the eternal AMD vs. Intel / AMD vs. Nvidia / AMD vs. the world debates, forget about average FPS; what about frame latencies? Doesn't the Intel and Nvidia combo currently offer not only better but *smoother* FPS comparatively?
 

tential

Diamond Member
May 13, 2008
7,348
642
121
All this means is AMD's CPUs will have a much better edge in future gaming.
Snip:::
So really the question isn't "why would you buy next gen Nvidia?" but rather "why would you buy next gen Intel?". Four fast cores cannot beat eight slightly slower cores when the application is made to utilize up to eight cores.

You really think Intel won't respond by making processors with more cores? Seriously?

Let's be real: just like with the Xbox 360 and PS3, developers will not be using anywhere near the full potential of the new PS4 or 720. They won't start developing anything using all 8 threads for a while. By the time they do, Intel will have 6-8 core CPUs that are more than affordable.

It's all speculation though. Maybe AMD might hold a small advantage for a couple of months, but I'm HIGHLY doubting Intel won't have an offering ready when programmers start utilizing more threads. Hell, they already have very affordable 6-core processors. Don't get me wrong, I'm no Intel fanboy; I buy the best product out at the time. But I really don't see Intel dropping the ball again like they did with the Pentium 4. This is a NEVER AGAIN for them.

My OPINION: by the time the PS4 and Xbox 720 start utilizing 8 threads, Intel will be well beyond that point.
On a side note: aren't we due for some Conroe-type performance jump in the CPU or GPU industry? Seriously, help us out, AMD/Intel/Nvidia...
 
Jan 31, 2013
108
0
0
You really think Intel won't respond by making processors with more cores? Seriously?

Let's be real: just like with the Xbox 360 and PS3, developers will not be using anywhere near the full potential of the new PS4 or 720. They won't start developing anything using all 8 threads for a while. By the time they do, Intel will have 6-8 core CPUs that are more than affordable.
Snip:::
You do know it's standard for us developers to utilize every resource available on a console, given that every unit runs the same exact hardware. I don't see Intel offering up 3960Xs for $300 any time soon either.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Why? Because Nvidia makes great GPUs and because I like the features and possibilities.
I don't think that AMD having the consoles will change much, if anything at all, performance-wise. Maybe GK104 and below will have problems, since AMD could push SSAA and leverage their GFLOPs, but since the question is about Maxwell, I don't think we have a problem.
 

Arzachel

Senior member
Apr 7, 2011
903
76
91
Seeing as I upgrade my GPU annually if required: Nvidia. Better drivers, and this gen Kepler runs cooler and uses less power. As for the eternal AMD vs. Intel / AMD vs. Nvidia / AMD vs. the world debates, forget about average FPS; what about frame latencies? Doesn't the Intel and Nvidia combo currently offer not only better but *smoother* FPS comparatively?

Nope, the 7970 GHz Edition is faster and smoother in the majority of games, while also being cheaper than the 680. Once AMD patched up the frame latencies, PCPer pretty much started to ignore the single-GPU comparisons, because they made the 680 look even worse. The power difference is also only around 20-30 W under load.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
With the Xbox reveal just a few days away, it has been confirmed that AMD will be powering all three next gen consoles. Microsoft and Sony have stated in the past that these next gen consoles are being developed for the next 10 years of gaming. That means for 10 years, developers making exclusive or cross platform titles will be coding for AMD hardware first, everyone else second.

With that in mind, why would a PC Gamer want an Nvidia GeForce card over an AMD Radeon for the next generation of video card releases?

Nvidia claimed they weren't interested in a PS4 design win due to slim margins, but isn't the opportunity cost greater than "slim margins" with AMD now having a monopoly?

What are your thoughts?

Disclaimer: I'm just a gamer. I'm not a programmer or developer, and I'm clearly ignorant of how a game goes from conception to market. I'm only inquiring.

It will make zero difference. But even if we play along and imagine it does, it will only do so on current AMD cards. One new uarch from AMD and all the hypothetical bonus would be gone.

The PS3 and Xbox 360 showed no difference at all between AMD (ATI) and Nvidia cards.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,818
1,553
136
Obviously a lot of the close-to-metal stuff isn't going to translate to PC graphics cards unless AMD can develop some sort of common low-level API that runs across all three platforms.

On the other hand, you'd have to be a pretty wishful Nvidia fanboy if you can't see how a lot of the high-level stuff will transfer: Forward+, tessellation at the subdivision factors where GCN has a big advantage, writing compute and shader code with GCN's strengths and weaknesses in mind, general resource allocation, and so on.

It's a big advantage, but it will only become insurmountable for Nvidia if AMD figures out a way to get the low-level optimization done on the console side to carry over to PC.

I have two predictions:

1. AMD is going to cut R&D in the hope that developers will target the existing architecture, that games will become better optimized as time goes on, and that they can do just fine making only minor changes. The main focus in future generations will be scaling the architecture up, as each new node allows more of the same functional units in the same space.

2. Nvidia is going to focus far more on any perceived weakness in their products where AMD is also strong, as such areas will end up as major bottlenecks when GCN becomes the default target platform. Future GPUs will become more GCN-like, while Nvidia looks to develop features that can easily be tacked on by developers, for example a better filtering method than current anisotropic filtering.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
You do know it's standard for us developers to utilize every resource available on a console, given that every unit runs the same exact hardware. I don't see Intel offering up 3960Xs for $300 any time soon either.

I'm not really sure I understand this. As console generations go on, we see games make much better use of the hardware.
If it were as you say, we wouldn't see that improvement; we'd see games at launch taking full advantage of the hardware and its resources.