Why would you buy next gen Nvidia?


3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Because Intel is incapable of making eight core consumer CPUs?

Did I even mention Intel?

I never said it would be bad for Intel, or better than Intel. An 8-thread desktop, though, regardless of brand, shouldn't run into any bottlenecks, because the game only uses a few threads.
 

lagokc

Senior member
Mar 27, 2013
808
1
41
Beyond having an x86 CPU and a GPU based on GCN, the new Xbox is even rumored to be running a lite version of Windows 8.

I don't think future ports will be as terrible as you're implying with this next generation of consoles.

It doesn't matter what internal hardware or OS consoles run; console games are designed with the controller in mind, as opposed to the mouse and keyboard. The best games throughout time simply do not lend themselves to being designed around a controller (Total War games, the Baldur's Gate series, pretty much all FPS), and when companies try to make them work as console games, the PC version always suffers for it.
 

lagokc

Senior member
Mar 27, 2013
808
1
41
the consoles are not dealing with PC DX and Drivers...

I'm actually kind of hoping that developers will start to abandon DX for OpenGL. Less reliance on Windows is something we should want now that it seems Phaëton is driving Microsoft...
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
I'm actually kind of hoping that developers will start to abandon DX for OpenGL. Less reliance on Windows is something we should want now that it seems Phaëton is driving Microsoft...

Not gonna happen. And the OpenGL community can only blame themselves.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
AMD will likely have a new architecture out by the time devs start really taking advantage of the PS4's hardware.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Look, you don't have to get all mad over being wrong. It would be a CPU bottleneck if that was what was holding back the benchmarks. In fact it's neither; reaching full capacity on a piece of hardware isn't called "bottlenecking". I might have said it was a CPU bottleneck, but that was only to put your GPU-bottleneck theory to rest (which is the more wrong of the two). A bottleneck means one piece of hardware is holding back the potential of another. I made that very clear in my previous posts. All games can be bottlenecked, but there is more to it than just hardware. The graph shows stock clocks, and both CPUs set to 4.4GHz. The gains from the boost are only marginal, meaning the GPUs can't do any more with what either CPU has to offer.

Just stop, learn a little before you continue. Hell, just read the review you got that benchmark from. Even it tells you fairly clearly that the GPUs are holding back the performance, with this:
http://www.tomshardware.com/reviews/fx-8350-vishera-review,3328-13.html
Historically, we've seen AMD's processors bottleneck the performance of certain games at low resolutions and mainstream quality settings. Using a GeForce GTX 680 at Battlefield 3's Ultra quality preset, however, reveals no such limitation (even with anti-aliasing disabled completely).

Of course, this only applies to the single-player campaign, which tends to be GPU-heavy. The multi-player element of Battlefield 3 is more taxing on the processor. But because it's difficult to create a repeatable benchmark involving 63 other players, we'll move on to another game notorious for its emphasis on CPU speed.
Notice it notes there is NO CPU bottleneck. Also note that it says BF3 has no issues in single player in this regard. The only CPU bottlenecks in BF3 are in multiplayer, something your chart is not showing.

http://www.tomshardware.com/reviews/battlefield-3-graphics-performance,3063-13.html
Looking back at my notes for the Bulldozer launch (AMD Bulldozer Review: FX-8150 Gets Tested), AMD was very enthusiastic about FX's performance in Battlefield 3 (multiplayer beta, at the time). And no wonder: Battlefield 3's single-player campaign doesn't care if you're using a $130 Core i3 or $315 Core i7. It doesn't care if you come armed with two Hyper-Threaded cores or four Bulldozer modules. It just. Doesn't. Care.
In fact, after getting a little overzealous swapping out Lynnfield-, Clarkdale-, and Sandy Bridge-based chips, I tried one AMD CPU and decided to call it a day. Any reasonably modern processor is going to be held back by graphics long before hamstringing performance itself.
http://www.tomshardware.com/reviews/fx-8350-core-i7-3770k-gaming-bottleneck,3407-5.html
AMD’s lower-cost FX-8350 continues to maintain performance parity in Battlefield 3, even as our highest resolution and detail settings lean hard against a pair of Radeon HD 7970s.
They all say the same thing over and over: in single-player BF3, the game is GPU-bound and it doesn't matter what CPU you use. Heck, one of those even shows that 2 cores can equal 6 cores.

And since you changed your mind on it being CPU bottlenecked and are now claiming the game isn't GPU-bound, how do you explain all the other benchmarks that show faster GPUs continuing to improve performance?
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_SLI/6.html
[benchmark chart: Battlefield 3 at 2560x1600]

(interestingly, at 1920x1200, with 3-way SLI or CF, a CPU bottleneck starts to occur).
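
To make the "only marginal gains from the CPU boost" reasoning concrete, here's a minimal sketch of that heuristic in Python. It isn't taken from any of the linked reviews; the FPS numbers and the 5% threshold are hypothetical, purely for illustration: if raising the CPU clock barely moves the frame rate while a faster GPU still does, the game is GPU-bound at those settings.

```python
# Rough sketch (not from the linked reviews): if overclocking the CPU
# barely changes average FPS, the GPU is the limit at those settings.
# All numbers and the 5% threshold below are made up for illustration.

def relative_gain(baseline_fps: float, boosted_fps: float) -> float:
    """Fractional FPS improvement from a hardware change."""
    return (boosted_fps - baseline_fps) / baseline_fps


def likely_gpu_bound(fps_stock_cpu: float, fps_oc_cpu: float,
                     threshold: float = 0.05) -> bool:
    """Call it GPU-bound if a CPU overclock yields under ~5% more FPS."""
    return relative_gain(fps_stock_cpu, fps_oc_cpu) < threshold


if __name__ == "__main__":
    # Hypothetical single-player BF3 averages: stock CPU vs 4.4GHz.
    stock, overclocked = 92.0, 94.0
    print(f"CPU overclock gain: {relative_gain(stock, overclocked):.1%}")
    print("GPU-bound at these settings?", likely_gpu_bound(stock, overclocked))
```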
 
Last edited:

Lavans

Member
Sep 21, 2010
139
0
0
To answer the original post - more flexible driver controls. If you do any sort of driver tweaks, Nvidia should be your first consideration.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
With that in mind, why would a PC Gamer want an Nvidia GeForce card over an AMD Radeon for the next generation of video card releases?
Because nVidia's solutions fit my requirements just a little better than AMD's do.

But good on AMD for scoring those three wins - the red team needs some money to give us the benefits of competition. :thumbsup:
 

Obsoleet

Platinum Member
Oct 2, 2007
2,181
1
0
"Why would you buy next gen Nvidia? "

I wouldn't... not after having an excellent experience with my current Radeon 5870. No problems, still going strong, and none of the driver problems my last GeForce had. That's how you earn another purchase from a customer.
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
I've bought both Nvidia and ATI (whoops, AMD :biggrin:). I've now become partial to Nvidia because of the ease of drivers. No knock on AMD products, just a personal preference.
 

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
^ LOL.

"I prefer AMD due to ease of drivers." obsoleet @ 10:25

"I prefer Nvidia due to ease of drivers." guskline @ 10:36

And there you have it, OP.
 

darckhart

Senior member
Jul 6, 2004
517
2
81
as others have said, the performance of the console and pc versions of the same game doesn't necessarily tell you anything about the other, even with console hardware "becoming more like a pc" this next generation. so if one is choosing between nv and amd on the pc, it should be considered purely from that use case alone.

having owned numerous amd and nv cards, i find that i have fewer day-to-day annoyances with nv than amd across all usage scenarios (gaming, office work, video, etc). therefore my preference will be nv, unless 1) amd offers a significantly greater performance advantage this cycle (eg, GTX 700 series vs Volcanic Islands), or 2) nv prices things stupidly expensive compared to amd's directly competing models (eg, GTX 680 vs 7970).
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
With that in mind, why would a PC Gamer want an Nvidia GeForce card over an AMD Radeon for the next generation of video card releases?

Console and PC are different platforms. Similar thoughts were offered when AMD/ATI did win console contracts years ago.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
According to this article (http://www.eurogamer.net/articles/digitalfoundry-inside-killzone-shadow-fall), Killzone for PS4 uses 3GB of its memory for its graphical components. My take on this is that if a first-gen game for the PS4 already uses that much memory at 1080p, anybody with a 1440p/1600p display will need at least 6GB or more to be 'future proof'.

After having seen the game, I'm not impressed. Crysis 3 and Battlefield 3 both look better to me, and neither uses 3GB, or even 2GB in most cases. Not that it won't be a nice-looking game or even a good shooter for a console; I just don't see anything as impressive as games that have already been released. I don't know what UE4 will bring, but from what Epic has stated, they did have to revise their demos to fit within the performance constraints of the PS4. Not a lot, but some small things might be noticeable to the trained eye, and it will only go from there, with newer and faster GPUs for the PC able to do more at higher FPS, higher resolution, and with more AA.

You aren't going to work miracles. Nobody builds a 4GB 6950, because for what the 6950 can do, more memory wouldn't make it any faster. Every piece of hardware has a limit to what it can do in terms of polygon counts, complexity of lighting and shaders, post-process filtering, and so on. You can make things efficient, but eventually you hit a wall where it simply cannot do more. This is why I think many developers are claiming 30fps is fine. I don't think it's fine for most titles, but I believe it's about making the game a bit more graphically impressive by freeing up some of the power envelope that would have been used to push for 60fps. On the PC, these efficiency adjustments tend to come from driver teams working to get new titles running optimally on their respective hardware.


AMD will likely have a new architecture out by the time devs start really taking advantage of the PS4's hardware.

This is true as well. When we see games go from the level of Uncharted 1 to Uncharted 3, like we saw over the PS3's life, we will be on GPUs likely a generation, if not two, ahead. The brute-force approach will have surpassed any optimizations to be made in the PS4 and Xbox. We already have a GTX Titan with 6GB of memory. There will be no real benefit on the hardware side at all. As for the CPU, we don't know the future, but I'd bet that when the time comes Intel will have a roadmap for 8 cores or even more, perhaps sometime down the line when the software side catches up to what the CPUs have available.

Nope, the 7970 GHz is faster and smoother for the majority of games, while also being cheaper than the 680. Once AMD patched up the frame latencies, PCPer pretty much started to ignore the single-GPU comparisons, because they made the 680 look even worse. The power difference is also only around 20-30W under load.

Faster than what? Remember, there is a market for Titan-level cards; the sales show this. There are people willing to pay for the latest and greatest GPUs, even at the prices we see them at today. Yes, it's a niche, but that niche is occupied solely by Nvidia.
 
Last edited:

Siberian

Senior member
Jul 10, 2012
258
0
0
Consoles sure did not help the PowerPC architecture. Having a chip in the 360 did not seem to give AMD any advantage. Consoles are becoming set-top-box media servers that also play games.
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
The fallacy in the original post is that game developers will automatically focus most of their resources on dedicated game consoles powered by AMD graphics. The trend moving forward is that game developers will increasingly devote more and more resources towards mobile gaming on open platforms, and game developers will try to take advantage of these huge ecosystems by providing free-to-play games or low cost games that reach a huge audience rather than a restricted audience. And for those game developers who want to push forward bleeding edge graphics quality, PC gaming on open platforms will generally be the preferred choice. Within the next 1-3 years, PC graphics technology will be well ahead of what is possible in Playstation Next or Xbox Next.

I don't know about that; video card development has slowed to a snail's pace. The past two years have shown virtually no massive increase in graphics power. Moore's law for the desktop is dead. Mobile is where all the emphasis is.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I don't know about that; video card development has slowed to a snail's pace. The past two years have shown virtually no massive increase in graphics power. Moore's law for the desktop is dead. Mobile is where all the emphasis is.

However, is mobile where the money is? I'd tend to say no. I don't know of anyone who has paid more than a couple of dollars for a mobile game or bought virtual items inside one; they just quit playing when they reach a point that artificially prevents them from continuing. Most people I know just play free games and usually drop them completely after 10-15 minutes, moving on to the next. When you have high production values, you have high costs, and then you must charge a reasonable sum for the game at release. Will people buy a $40 game on their tablet, and buy a controller for it too? Doubtful, not when you have thousands of games priced at $2.99 or less.

Is the future of gaming one where everyone seemingly has a form of ADD and can't finish a 10-, 20-, or 40-hour experience, and every game has zero production value? I don't think so.

Personally, I think the industry needs a crash. Then all the casuals can move on to the next fad, all the studios trying to cash in with yearly releases can go belly up, and a core of dedicated developers and gamers will remain. Only then will things feel fresh again. I don't think it'll happen (it's too big an industry now), but I sort of feel that's what we need.
 
Last edited:

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
Console GPU purchases have nothing to do with PC purchases. Even if AMD gets an influx of money from this, if anything it will hurt the PC portion of their GPU business, since they will be dedicated to pumping out console chips.

I hate to say this, but consoles are terrible for the PC business; they pull too many resources away from tech advances on the PC front. All we have seen the last few years is GPUs getting slightly faster on slightly better hardware. Nothing that blows stuff out of the water like the early advances did.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Console GPU purchases have nothing to do with PC purchases. Even if AMD gets an influx of money from this, if anything it will hurt the PC portion of their GPU business, since they will be dedicated to pumping out console chips.

I hate to say this, but consoles are terrible for the PC business; they pull too many resources away from tech advances on the PC front. All we have seen the last few years is GPUs getting slightly faster on slightly better hardware. Nothing that blows stuff out of the water like the early advances did.

I'm not sure where you draw this conclusion from. The R&D for the console APUs will most definitely find its way into the PC. Also, the added volume of APUs at the foundries will help AMD fulfill their obligations, and will likely also get them some volume and preferential-customer status. That's going to be a lot of volume (don't forget the Xbox as well).

Don't you think that Titan (except for price) blows Fermi out of the water?