
Why would you buy next gen Nvidia?

With the Xbox reveal just a few days away, it has been confirmed that AMD will be powering all three next gen consoles. Microsoft and Sony have stated in the past that these next gen consoles are being developed for the next 10 years of gaming. That means for 10 years, developers making exclusive or cross platform titles will be coding for AMD hardware first, everyone else second.

With that in mind, why would a PC Gamer want an Nvidia GeForce card over an AMD Radeon for the next generation of video card releases?

Nvidia claimed they weren't interested in a PS4 design win due to slim margins, but isn't the opportunity cost greater than "slim margins" with AMD now having a monopoly?

What are your thoughts?

Disclaimer - I'm just a gamer. I'm not a programmer or developer, and I'm clearly ignorant of how a game goes from conception to market. I'm only inquiring.
 
I think it's safe to say that the Xbox 360, which had an ATI (now AMD) GPU and came out prior to the PS3, "won" in terms of hardware and software sales. Yet it did not seem to affect the PC side of gaming much at all. Having all 3 consoles might help short term, but when 20nm (and 16nm) cards come, the performance and architectural differences on both sides will be substantial enough that coding for the PS4 and/or Xbox 3 won't mean automatic optimizations for PC AMD video cards.
 
Because I like the name 😛 But seriously, it seems the 7970 GHz Edition will be AMD's fastest desktop SKU until late 2013, and people need something new and shiny. There are many games where a fast GPU is very desirable, like Metro: Last Light, to enable all the eye candy. Just because AMD is in the PS4 and the next Xbox doesn't necessarily mean the PC port will run faster on an AMD GPU.
 
I think it's safe to say that the Xbox 360, which had an ATI (now AMD) GPU and came out prior to the PS3, "won" in terms of hardware and software sales. Yet it did not seem to affect the PC side of gaming much at all. Having all 3 consoles might help short term, but when 20nm (and 16nm) cards come, the performance and architectural differences on both sides will be substantial enough that coding for the PS4 and/or Xbox 3 won't mean automatic optimizations for PC AMD video cards.

Yeah, we're seeing brand-new architectures next year that are going to be very different from what AMD is putting in the next gen consoles. Makes sense to me as well. Short term? Possibly. After new architectures? Probably not.
 
Even with Gaming Evolved titles, it isn't uncommon to see them running better on Nvidia hardware. I don't really see future titles running poorly on Nvidia just because AMD is making the console parts.
 
I think it's safe to say that the Xbox 360, which had an ATI (now AMD) GPU and came out prior to the PS3, "won" in terms of hardware and software sales. Yet it did not seem to affect the PC side of gaming much at all. Having all 3 consoles might help short term, but when 20nm (and 16nm) cards come, the performance and architectural differences on both sides will be substantial enough that coding for the PS4 and/or Xbox 3 won't mean automatic optimizations for PC AMD video cards.

The 360/PS3 were not x86, though. The new consoles are, which means porting a game from console to PC is going to be much less work. And some optimizations made for GCN on the console will translate to GCN on the PC.

But I think the biggest thing that will affect PC performance is that x86 code on the consoles won't have to be re-optimized for x86 PCs.
 
Most likely you won't see any AMD GPU-specific benefits at all.
Maybe we'll see multithreading benefits, though, as console programmers try to make use of the multi-core CPUs. However, since both Intel and AMD have CPUs that can handle as many threads as the consoles, we'll just see increased performance across the board. I don't think AMD will get any large benefit beyond increased profits, which they desperately need. They had better hope both the PS4 and the new Xbox are successes.
 
The fallacy in the original post is the assumption that game developers will automatically focus most of their resources on dedicated game consoles powered by AMD graphics. The trend moving forward is that game developers will increasingly devote resources to mobile gaming on open platforms, trying to take advantage of those huge ecosystems with free-to-play or low-cost games that reach a huge audience rather than a restricted one. And for those developers who want to push bleeding-edge graphics quality, PC gaming on open platforms will generally be the preferred choice. Within the next 1-3 years, PC graphics technology will be well ahead of what is possible on the PlayStation Next or Xbox Next.
 
That means for 10 years, developers making exclusive or cross platform titles will be coding for AMD hardware first, everyone else second.

Not really. Many will be coding for Unreal Engine 4 or some other engine. Assuming the engine authors do their job, the engine will offer good support for PC drivers from both Nvidia and AMD. Then both AMD and Nvidia can do the usual per-game optimizations for SLI/CrossFire.

Also, "sold for 10 years" and "the main console for 10 years" are two different things.

Also, Nvidia did not have an x86 chip, and ARM is too slow, so they had nothing comparable to offer at any price.
 
Actually, I feel exactly the opposite. Because developers will be using AMD hardware in the consoles, they will do less to make sure titles run well on AMD PC cards, figuring the ports should come easily, while Nvidia will do MORE work with developers to make sure titles run correctly on their cards to keep people buying them.
Hence games will run better on Nvidia hardware.

Think about it this way: Nvidia has MORE reason to work with developers (which they have an awesome track record for) than ever before. AMD has less.
 
I don't think today's 28nm GPUs are good for "future-proofing", but I do believe the 20nm GPUs will have some headroom over the consoles. And if games do show an advantage on AMD's GPUs, I hope Nvidia will work more with the developers to optimize for their GPUs as well.
I think any GPU with less than 3GB of VRAM will age poorly.

I've seen it said that the optimizations for the 360's GPU didn't carry over to AMD's PC GPUs, but the 360's GPU was apparently quite different from those.
 
I don't think today's 28nm GPUs are good for "future-proofing", but I do believe the 20nm GPUs will have some headroom over the consoles. And if games do show an advantage on AMD's GPUs, I hope Nvidia will work more with the developers to optimize for their GPUs as well.
I think any GPU with less than 3GB of VRAM will age poorly.

I've seen it said that the optimizations for the 360's GPU didn't carry over to AMD's PC GPUs, but the 360's GPU was apparently quite different from those.

If the past is any indication, there will not be any issues with low VRAM, since it takes more GPU power than older cards have to use the settings that need more VRAM. The only time that doesn't hold true is at extreme resolutions, and even then it would be rare.
 
Actually, I feel exactly the opposite. Because developers will be using AMD hardware in the consoles, they will do less to make sure titles run well on AMD PC cards, figuring the ports should come easily, while Nvidia will do MORE work with developers to make sure titles run correctly on their cards to keep people buying them.
Hence games will run better on Nvidia hardware.

Think about it this way: Nvidia has MORE reason to work with developers (which they have an awesome track record for) than ever before. AMD has less.

That's a stupid way of putting it lol.
 
If the past is any indication, there will not be any issues with low VRAM, since it takes more GPU power than older cards have to use the settings that need more VRAM. The only time that doesn't hold true is at extreme resolutions, and even then it would be rare.

Furthermore, while the 7950 and above have more than 2GB, MOST cards are bought at a lower price point, meaning the 7870 and below. No developer is going to alienate the majority of their market.
In addition, the target for TVs is 1080p, so super-high-res textures are not necessary.
 
If the past is any indication, there will not be any issues with low VRAM, since it takes more GPU power than older cards have to use the settings that need more VRAM. The only time that doesn't hold true is at extreme resolutions, and even then it would be rare.

I'd say the 8800GT 256MB aged pretty poorly compared with its 512MB brother.
 
I'd say the 8800GT 256MB aged pretty poorly compared with its 512MB brother.

I don't recall the comparisons for that card, but one thing that can skew your impression of these cards is when review sites use unnecessarily high levels of AA and settings at which even the double-VRAM counterparts cannot get good performance.

Review sites do this a lot: they use settings where only the very top graphics cards get decent FPS, and then show all the lesser cards getting FPS that isn't playable, no matter how much VRAM they have.
 
 
Gonna wait till the 20nm GPUs for my next cash splash, and with God's help *laughs* will wait till both show their cards 😛.
 
According to this article (http://www.eurogamer.net/articles/digitalfoundry-inside-killzone-shadow-fall), Killzone: Shadow Fall for PS4 uses 3GB of its memory for graphical components. My take on this is that if a first-gen game for PS4 already uses that much memory at 1080P, anybody that has 1440P/1600P will need at least 6GB or more to be 'future proof'.
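That extrapolation can be sanity-checked with a quick back-of-the-envelope script: only the resolution-dependent part of a VRAM budget (framebuffers, G-buffers) grows with pixel count, while textures and geometry are largely resolution-independent. The 50/50 split below is purely my illustrative assumption, not a figure from the article:

```python
# Rough sketch: scale a measured VRAM budget from one resolution to another,
# assuming only a fraction of it grows with pixel count.

def scaled_vram_gb(base_gb, base_res, target_res, res_dependent_frac=0.5):
    """Estimate VRAM needed at target_res given a measured budget at base_res."""
    base_pixels = base_res[0] * base_res[1]
    target_pixels = target_res[0] * target_res[1]
    ratio = target_pixels / base_pixels  # 1440p/1080p is about 1.78x
    # Only the resolution-dependent fraction of the budget scales up.
    return base_gb * ((1 - res_dependent_frac) + res_dependent_frac * ratio)

print(round(scaled_vram_gb(3.0, (1920, 1080), (2560, 1440)), 2))  # ~4.17 GB
```

Under the 50/50 assumption, Killzone's 3GB at 1080p becomes roughly 4.2GB at 1440p; if everything scaled with pixel count (`res_dependent_frac=1.0`) it would be about 5.3GB, so a 6GB "future proof" figure is in the right ballpark rather than an exact requirement.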

1440P/1600P will need at least 6GB or more to be 'future proof'... Jesus fucking wept, hahaha. Cheer me up. LOL *slap*.
 
It's unlikely to make a lot of difference. PC graphics cards already outperform the console equivalents, and after a couple of performance doublings it really won't make any difference at all. Both AMD and Nvidia will have left the console architecture far behind in 4 years, and both will be somewhere quite different from the hardware that is in consoles today. Any advantage will be very short-lived.
 