[Recanted] All Frostbite 3 Titles to Ship Optimized Exclusively for AMD


AnandThenMan

Diamond Member
Nov 11, 2004
No, this is EXACTLY what nVidia's "The Way It's Meant to Be Played" program did.
Exactly the same. Nvidia was praised up and down for their dev relations, and it was one of the selling points we heard over and over again for Nvidia cards, thanks to NV's "proactive" nature.
 

AnandThenMan

Diamond Member
Nov 11, 2004
How low, both for AMD and EA. I don't like where this is headed...

Down the same path TWIMTBP has gone down for years? I agree with you; I never liked the fracturing that goes on because of these dev programs. But AMD is fighting fire with fire, something many predicted they would have to do to combat NV. Let's just hope AMD does not start putting in vendor ID checks and disabling features that work perfectly fine on GeForce cards, the way Nvidia has a habit of doing when a game detects Radeon hardware.
 

DaveSimmons

Elite Member
Aug 12, 2001
It sounds like this is a good reason to never buy an EA game at release. Buy after a price drop or two and get great performance (and patches for some of the bugs) while saving money.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
http://hothardware.com/News/Indepth...Shows-Highly-Questionable-Tessellation-Usage/

No reason to believe Crysis 2 was the only game where certain choices were made to differentiate Nvidia from AMD.

People still see something sinister in that? Crysis 2 wasn't even DX11 at release; what some see as Nvidia's evil hand, others see as a clearly hasty patch job to incorporate DX11.

Crysis 3 should be proof of that: Nvidia is still better than AMD at tessellation, and yet, even with Nvidia in dev relations for almost its entire development cycle, it still runs fine on both sides.
 


Granseth

Senior member
May 6, 2009
(...)

I have no problem with it. First and foremost, I'm not running nVidia right now anyway. Secondly, "optimized" is just marketing BS. Third, I have little concern over the quality of nVidia's driver team. And lastly, it strikes me as highly unlikely that nVidia won't see the code; you'd have to be a moron to lock out 65% of the PC market's dedicated gamers and make your game look terrible at release.

I think you are mostly right. But in some games we might see that nVidia is one driver revision away from optimized performance, and in other games both AMD and nVidia will be miles away from optimal drivers.
 

Kenmitch

Diamond Member
Oct 10, 1999
You guys love to make things red vs. green.

Most games are targeted at the console market first. If you buy into the next-gen consoles, then wouldn't you expect the code to be optimized for that hardware? If you developed the game, would you strip those optimizations out for a PC port?

This time around, being in the next-gen consoles will most likely be more valuable than Nvidia thought.

There is always brute force to overcome the optimizations. :)
 

brandonb

Diamond Member
Oct 17, 2006
Down the same path TWIMTBP has gone down for years? I agree with you; I never liked the fracturing that goes on because of these dev programs. But AMD is fighting fire with fire, something many predicted they would have to do to combat NV. Let's just hope AMD does not start putting in vendor ID checks and disabling features that work perfectly fine on GeForce cards, the way Nvidia has a habit of doing when a game detects Radeon hardware.

Vendor ID checks are lame, which is why I buy AMD and not NV at this point in time. (I bought AMD because of the vendor ID checks in PhysX. It's a matter of principle.) I hope it does not come to that with AMD, otherwise I'll have to decide which features I want in my gameplay experience: "PhysX or superhair" (or whatever AMD creates).
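
For anyone who hasn't seen one, this is roughly what a vendor ID check amounts to. A minimal sketch (an illustration, not any shipping game's code) that reads the primary adapter's PCI vendor ID through DXGI; the IDs themselves are public (0x10DE = NVIDIA, 0x1002 = AMD/ATI). The objectionable pattern is using the result to switch features off outright rather than to pick a faster code path.

Code:
// Minimal sketch of a DXGI vendor ID check (illustrative only).
// Build with a Windows SDK and link against dxgi.lib.
#include <windows.h>
#include <dxgi.h>
#include <cstdio>

int main()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    if (factory->EnumAdapters(0, &adapter) == S_OK)  // adapter 0 = primary GPU
    {
        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);

        // This is the part people object to: gating features on the brand of
        // the card instead of on whether the hardware can actually run them.
        const bool isNvidia = (desc.VendorId == 0x10DE);
        const bool isAmd    = (desc.VendorId == 0x1002);
        printf("VendorId 0x%04X (%s)\n", desc.VendorId,
               isNvidia ? "NVIDIA" : isAmd ? "AMD" : "other");

        adapter->Release();
    }
    factory->Release();
    return 0;
}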
 

Deders

Platinum Member
Oct 14, 2012
People who are kinda blah about FXAA and especially the crap that is TXAA.

What about SMAA? Watch the second half of this video: similar performance to FXAA, but much sharper.

You can download the files needed to inject SMAA into most games for free with injectSMAA or SweetFX.
 

GaiaHunter

Diamond Member
Jul 13, 2008
I think you are mostly right. But in some games we might see that nVidia is one driver revision away from optimized performance, and in other games both AMD and nVidia will be miles away from optimal drivers.

One only needs to look at Blizzard's SC2 and WoW to see an extreme example of games being Intel and NVIDIA optimized.

This might actually have a bigger impact on CPUs than on GPUs, since we already see games being optimized for NVIDIA or AMD, and each side soon catches up with the other.
 

Vesku

Diamond Member
Aug 25, 2005
People still see something sinister in that? Crysis 2 wasn't even DX11 at release; what some see as Nvidia's evil hand, others see as a clearly hasty patch job to incorporate DX11.

Crysis 3 should be proof of that: Nvidia is still better than AMD at tessellation, and yet, even with Nvidia in dev relations for almost its entire development cycle, it still runs fine on both sides.

Three months between launch and the update. The road dividers could just be enthusiasm for tessellation, but the underground tessellated ocean was never adequately explained.
 

SirPauly

Diamond Member
Apr 28, 2009
It sounds like this is a good reason to never buy an EA game at release. Buy after a price drop or two and get great performance (and patches for some of the bugs) while saving money.

Kinda odd of EA, though, considering most of the PC market share is dominated by nVidia and Intel.
 

cmdrdredd

Lifer
Dec 12, 2001
It was only a matter of time with all the chum in the water.


I have no problem with it. First and foremost, I'm not running nVidia right now anyway. Secondly, "optimized" is just marketing BS. Third, I have little concern over the quality of nVidia's driver team. And lastly, it strikes me as highly unlikely that nVidia won't see the code; you'd have to be a moron to lock out 65% of the PC market's dedicated gamers and make your game look terrible at release.

Pretty much says it all. Nvidia will have access to the game to do their thing, so when the game drops they can release their driver. They may not be able to release a driver beforehand, but I don't know how EA could prevent them from doing anything. If Nvidia said "screw it" and just dropped a driver a week before the game launched, what would EA do? Cry?
 

Vesku

Diamond Member
Aug 25, 2005
" We have no doubt that the Bokeh filter and the GPU Water Simulation options could have been executed successfully on AMD’s Radeon HD 5000 series of GPUs. That the developers chose NVIDIA’s CUDA technology over Microsoft DirectCompute or even OpenCL is probably due to the fact that NVIDIA’s developer relations team worked with Avalanche Studios developers, and of course they like to promote their own products. (We would surely love to see the contract between the two, but that will never happen.) It is certainly their right to do so, just as it is Avalanche’s right to choose whatever API they want to use. We would certainly not presume to tell any independent game developer how to design their own game, but we would suggest that a more open alternative (such as OpenCL or DirectCompute) would have been preferred by us for those gamers without CUDA compatible hardware.

This is an old argument, and is basically analogous to the adoption of PhysX as opposed to a more broadly compatible physics library. NVIDIA wants to increase its side of the GPU business by giving its customers a "tangible" advantage in as many games as possible, while gamers without NVIDIA hardware would prefer that game developers had not forgotten about them."

http://www.hardocp.com/article/2010/05/04/just_cause_2_gameplay_performance_image_quality/7

It's not as if it wasn't pointed out for years that if AMD pursued similar game-promotion methods, we'd start seeing exactly what has started to occur: game performance becoming more brand-oriented.
 

akugami

Diamond Member
Feb 14, 2005
But Nvidia only asks to add Physx, AMD wants locked performance.

Maybe, if the PhysX effects could not possibly be done any other way. The net result was that graphical effects available in other games mysteriously "required" PhysX in any The Way It's Meant to Be Played game. NVidia was basically locking AMD out of graphical effects that clearly can be done on AMD cards. It's the same thing AMD is going for right now. The exact same thing. I disagreed with it when nVidia did it, and I disagree with AMD's decision. I understand AMD's position, but I disagree with it. The losers in the end are the gamers.
 

sushiwarrior

Senior member
Mar 17, 2010
Worst case, FB3 could implement some compute in the game engine, similar to Dirt: Showdown, which isn't even as bad a situation as Crysis 2 (or Metro: LL). I envision this being more "AMD enhancing" than TWIMTBP's "let's all lose performance so NV comes out on top".

Maybe, if the PhysX effects could not possibly be done any other way. The net result was that graphical effects available in other games mysteriously "required" PhysX in any The Way It's Meant to Be Played game. NVidia was basically locking AMD out of graphical effects that clearly can be done on AMD cards. It's the same thing AMD is going for right now. The exact same thing. I disagreed with it when nVidia did it, and I disagree with AMD's decision. I understand AMD's position, but I disagree with it. The losers in the end are the gamers.

But AMD has no proprietary technology here; they prefer OpenCL. They do have a distinct compute performance advantage, so if they add compute work to the game engine it's not like it won't run on NV hardware; it will just give AMD an edge, since GCN is much better at compute than Kepler.
 

tential

Diamond Member
May 13, 2008
It was only a matter of time with all the chum in the water.


I have no problem with it. First and foremost, I'm not running nVidia right now anyway. Secondly, "optimized" is just marketing BS. Third, I have little concern over the quality of nVidia's driver team. And lastly, it strikes me as highly unlikely that nVidia won't see the code; you'd have to be a moron to lock out 65% of the PC market's dedicated gamers and make your game look terrible at release.

Stop talking so much sense! This is the time for people to freak out and make crazy statements complaining about something. If we stop that and start educating people, how will news sites generate so many page hits and make ad revenue? Do you want to drive them out of business, you cruel, cruel man?
 

Erenhardt

Diamond Member
Dec 1, 2012
They do have a distinct compute performance advantage, so if they add compute work to the game engine it's not like it won't run on NV hardware; it will just give AMD an edge, since GCN is much better at compute than Kepler.

And that will be looked at as AMD crippling games running on NV GPUs.

In game A, an NV GPU has the same performance as a comparable AMD GPU.
In game B, the NV GPU gets half the frames of the AMD GPU.
That will be called crippling performance on competitor hardware right out of the gate, not improving their own performance by making use of all the available compute.
 

sushiwarrior

Senior member
Mar 17, 2010
And that will be looked at as AMD crippling games running on NV GPUs.

Yes, which is why it's a moral quandary, to say the least. Crysis 2, for example, was morally wrong: reduce everyone's performance, but reduce Nvidia's less. Adding compute, on the other hand, would speed up everyone's performance while (presumably) giving AMD an advantage with their superior compute architecture. Personally, I think this is perfectly valid, as Nvidia removed compute capability with Kepler to save costs, and as a result of their fat margins they will pay the price when it comes to performance. But this is all just personal opinion, and the presumption that adding compute would help AMD more than Nvidia.
 

Stuka87

Diamond Member
Dec 10, 2010
And that will be looked at as AMD crippling games running on NV GPUs.

In game A, an NV GPU has the same performance as a comparable AMD GPU.
In game B, the NV GPU gets half the frames of the AMD GPU.
That will be called crippling performance on competitor hardware right out of the gate, not improving their own performance by making use of all the available compute.

How so? OpenCL is an open, publicly available API. AMD is in no way locking nVidia out of using the exact same calls. It is not AMD's fault that nVidia decided to gimp their GPUs by removing the majority of their compute power.

And chances are the compute stuff will be used for optional graphics, much like PhysX was. Although it's possible some game-engine work will be done on the GPU (as a result of the consoles doing it), which could hurt nVidia. But that would be nVidia's fault for not having more compute power.
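
To make the "exact same calls" point concrete, here is a minimal sketch (an illustration, not code from any Frostbite title) that enumerates GPUs through the standard OpenCL host API. Nothing in it is vendor-specific, so the same binary runs against NVIDIA's and AMD's drivers and simply reports whatever hardware it finds.

Code:
// Minimal sketch: the same OpenCL host calls work against any vendor's driver.
// Build with the OpenCL headers and link against OpenCL (e.g. -lOpenCL).
#include <CL/cl.h>
#include <cstdio>

int main()
{
    cl_platform_id platforms[8];
    cl_uint platformCount = 0;
    if (clGetPlatformIDs(8, platforms, &platformCount) != CL_SUCCESS)
        return 1;
    if (platformCount > 8) platformCount = 8;

    for (cl_uint p = 0; p < platformCount; ++p)
    {
        cl_device_id devices[8];
        cl_uint deviceCount = 0;
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8,
                           devices, &deviceCount) != CL_SUCCESS)
            continue;  // platform has no GPU devices (e.g. a CPU-only runtime)
        if (deviceCount > 8) deviceCount = 8;

        for (cl_uint d = 0; d < deviceCount; ++d)
        {
            char vendor[256] = {0};
            cl_uint computeUnits = 0;
            clGetDeviceInfo(devices[d], CL_DEVICE_VENDOR,
                            sizeof(vendor), vendor, nullptr);
            clGetDeviceInfo(devices[d], CL_DEVICE_MAX_COMPUTE_UNITS,
                            sizeof(computeUnits), &computeUnits, nullptr);
            // Nothing above is vendor-specific: a compute-heavy effect built on
            // these calls runs on GeForce and Radeon alike; the only difference
            // is how quickly each architecture gets through the kernels.
            printf("GPU: %s, compute units: %u\n", vendor, computeUnits);
        }
    }
    return 0;
}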
 

SiliconWars

Platinum Member
Dec 29, 2012
Years of Nvidia's underhanded tactics come back to haunt them, and Nvidia fans lose their rag, lol.

Any way to give this thread 6 stars?
 