[Recanted] All Frostbite 3 Titles to Ship Optimized Exclusively for AMD


Rezist

Senior member
Jun 20, 2009
726
0
71
It's not like nVidia hasn't made developers code their games to run poorly on AMD hardware, as in the case of the Assassin's Creed DX10 patch. However, AMD is not likely to do that. Most likely all we will see is some minor optimizations, which will already be there since the engines are designed for consoles first.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Most likely all we will see is some minor optimizations, which will already be there since the engines are designed for consoles first.

This exactly. EA isn't going to spend any extra money optimizing for AMD desktop Radeons. The titles will just end up optimized for the architecture in the process of being developed for the console. It's a positive side effect, one that costs them nothing.
 

lefty2

Senior member
May 15, 2013
240
9
81
I'm just wondering if this means that Battlefield 4 will make use of the unified memory that will appear in the Kaveri APU. Unified memory means the hardware can support megatextures (at least in theory).
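To make that concrete, the megatexture win would be that the GPU can read tiles straight out of one big shared allocation instead of staging every tile across PCIe. A toy sketch of the difference (all the "GPU" bits below are stubs I made up to show the data movement, not any real driver API):

```cpp
// Toy illustration only: the "GPU" calls below are stubs written for this
// post, not a real driver API. The point is the data movement, nothing else.
#include <cstdint>
#include <cstdlib>
#include <cstring>
#include <vector>

constexpr std::size_t kTileBytes = 128 * 128 * 4;   // one 128x128 RGBA8 tile

// Discrete GPU today: every tile the virtual-texture system requests has to
// be copied into VRAM (over PCIe) before the GPU can sample it.
struct VramTile { std::vector<std::uint8_t> storage = std::vector<std::uint8_t>(kTileBytes); };

void upload_tile(VramTile& dst, const std::uint8_t* cpu_tile) {
    std::memcpy(dst.storage.data(), cpu_tile, kTileBytes);   // stands in for the PCIe copy
}

// Unified memory (the Kaveri/hUMA idea): the whole megatexture lives in one
// allocation that CPU and GPU both address, so the GPU reads the requested
// tile in place and the per-tile copy above simply disappears.
std::uint8_t* alloc_shared_megatexture(std::size_t tile_count) {
    return static_cast<std::uint8_t*>(std::malloc(tile_count * kTileBytes));
}

int main() {
    std::uint8_t* megatexture = alloc_shared_megatexture(1024);   // toy size
    VramTile vram;
    upload_tile(vram, megatexture);   // discrete path: one copy per streamed tile
    // unified path: the GPU would just sample megatexture + tile_index * kTileBytes
    std::free(megatexture);
}
```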
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Looks like they updated the article..

UPDATE: EA and AMD have issued a statement clarifying that while the two companies are collaborating on day-one support for Radeon products for Battlefield 4, the partnership is non-exclusive and gamers using other components will be supported.

“DICE has a partnership with AMD specifically for Battlefield 4 on PC to showcase and optimize the game for AMD hardware," an EA spokesperson said. "This does not exclude DICE from working with other partners to ensure players have a great experience across a wide set of PCs for all their titles.”
 
Aug 11, 2008
10,451
642
126
It sounds like this is a good reason to never buy an EA game at release. Buy after a price drop or two and get great performance (and patches for some of the bugs) while saving money.

I know a better reason not to buy an EA game at release: the games themselves, and the fact that you have to use Origin.

After caving in and buying Mass Effect 3 at full price, I vowed to never, ever pay full price for an EA game again.
 
Feb 19, 2009
10,457
10
76
I would put the blame on NV, honestly. If they would stop crippling compute on their cards and in their drivers, they would have fewer problems.

Tell me, did you rail against NV when they pushed for more tessellation performance in games, back when AMD was weak in tessellation? Or did you blame AMD for being weak in tessellation?

So when AMD is strong in compute, you are blaming AMD for pushing for more compute in games? Rather than blaming NV for crippling compute in their drivers, just so an NV customer has to buy a Titan to be un-throttled in compute?

We did rail hard against the stupid tessellation abuse in Crysis 2 and a few other games, where millions of polygons were generated for flat surfaces.

As gamers, we should expect better from developers. Use features wisely, not just to cripple competitor GPUs for the sake of "exclusive optimizations".
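For anyone who missed the Crysis 2 saga: the complaint was flat geometry tessellated at a fixed, very high factor, so millions of triangles went to surfaces that couldn't look any different. The standard fix is cheap and vendor-neutral. A rough sketch of the idea, written as plain C++ standing in for the hull-shader math (the numbers are placeholders I picked, not anything from a shipping engine):

```cpp
#include <algorithm>

// Adaptive tessellation in a nutshell: pick the factor from how much detail a
// patch can actually show on screen, instead of hard-coding 64x everywhere.
float tess_factor(float screen_edge_px,          // patch edge length projected to pixels
                  float max_displacement,        // height-map displacement of the patch
                  float px_per_subedge = 8.0f,   // target triangle density (placeholder)
                  float max_factor = 64.0f) {
    // A flat patch (no displacement) gains nothing from extra triangles.
    if (max_displacement <= 0.0f)
        return 1.0f;
    // Otherwise aim for roughly one sub-edge per N pixels of projected size.
    return std::clamp(screen_edge_px / px_per_subedge, 1.0f, max_factor);
}

// Fixed factor 64 on a flat concrete slab: thousands of invisible triangles.
// The heuristic above on the same slab: factor 1, i.e. the geometry you started with.
```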
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
In other news, my Titan/3930K won't have any issues with any upcoming port. And in all seriousness, if they optimise for AMD's crappy CPUs, then we'll finally see quad/hexa cores show big boosts anyway.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
This is exactly correct. AMD will ensure that their architecture's strengths are written for and that the devs won't be distracted by Nvidia's meddling in the meantime. Hopefully we'll see more of Dirt Showdown's lighting effects and compute without Nvidia "convincing" them otherwise.

You mean halving your FPS for literally ZERO graphics improvement? That's a great idea :rolleyes:
 

chimaxi83

Diamond Member
May 18, 2003
5,649
61
101
You mean halving your FPS for literally ZERO graphics improvement? That's a great idea :rolleyes:

I know man. I feel the same about retarded, overdone, extremely exaggerated, out of place PhysX effects added just because "PhysX! Yeah!"
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I know man. I feel the same about retarded, overdone, extremely exaggerated, out of place PhysX effects added just because "PhysX! Yeah!"

I like PhysX when it's used the way Batman used it. There was much more going on, like debris etc. It added something to the game IMO. I'm not a fan of how it's done on the GPU, though, or how big a performance hit it is even with SLI.

I think it can be done more efficiently and without being hardware-locked.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
CPU optimizations will be for AMD CPUs only. Basically better coding for their module-based chips, i.e. heavily threaded and light on FPU calculations on the CPU side, offloading the FPU stuff to the GPU.

This would be the minimum level of optimization to expect.
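The module angle, for anyone who hasn't looked at Bulldozer/Piledriver: two integer cores in a module share a single FPU, so integer-heavy threads scale per core while FP-heavy threads fight over the shared unit. A hedged sketch of what "coding for modules" amounts to (the two-cores-per-FPU ratio is my assumption from the Bulldozer layout, not anything EA has said):

```cpp
#include <algorithm>
#include <thread>

// On a module-based AMD chip, each pair of integer cores shares one FPU:
// integer/branchy jobs scale with logical cores, FP-heavy jobs only scale
// with the number of FPUs -- roughly half as many.
struct WorkerCounts {
    unsigned integer_workers;   // game logic, AI, visibility, decompression
    unsigned fp_workers;        // whatever FP work stays on the CPU
};

WorkerCounts plan_workers() {
    unsigned logical = std::max(1u, std::thread::hardware_concurrency());
    unsigned fpus    = std::max(1u, logical / 2);   // assumption: 2 cores per FPU
    return {logical, fpus};
    // The rest of the idea -- "offload the FPU stuff to the GPU" -- would mean
    // dispatching those FP batches as compute jobs instead of CPU threads.
}

int main() {
    WorkerCounts w = plan_workers();
    (void)w;   // a real engine would size its job-system thread pools from these
}
```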
 

chimaxi83

Diamond Member
May 18, 2003
5,649
61
101
I like PhysX when it's used the way Batman used it. There was much more going on, like debris etc. It added something to the game IMO. I'm not a fan of how it's done on the GPU, though, or how big a performance hit it is even with SLI.

I think it can be done more efficiently and without being hardware-locked.

I think Batman has been the only game that really used PhysX right. I switched from 680 SLI to 7970 CF when I was halfway through that game, and definitely missed it. Everything maxed out plus CPU PhysX was meh.

Not a fan of hardware locking. I like to switch between hardware, so I don't like not being able to fully enjoy something done well.
 

mindbomb

Senior member
May 30, 2013
363
0
0
I get that console games will be optimized for GCN, but I don't see how you can really optimize for Jaguar in a way that won't benefit both Intel and AMD processors.

Anyway, Nvidia's driver team has their work cut out for them, having to port these optimizations over to Kepler.
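Agreed, and that's how it tends to work out in practice: the Jaguar-friendly changes are 128-bit SIMD and cache-friendly data layouts, and every recent Intel core likes exactly the same things. A minimal example of the kind of loop the console work pushes you toward (plain SSE, which both vendors run well; the function itself is just something I made up for illustration):

```cpp
#include <xmmintrin.h>   // SSE intrinsics -- supported by Jaguar and every x86-64 Intel/AMD core
#include <cstddef>

// Scale-and-add over packed floats, four lanes at a time. "Optimizing for
// Jaguar" here means 128-bit vectors and linear, cache-friendly access --
// which speeds up an Intel CPU just as much.
void scale_add(float* dst, const float* a, const float* b, float s, std::size_t n) {
    __m128 vs = _mm_set1_ps(s);
    std::size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);
        __m128 vb = _mm_loadu_ps(b + i);
        _mm_storeu_ps(dst + i, _mm_add_ps(_mm_mul_ps(va, vs), vb));
    }
    for (; i < n; ++i)   // scalar tail for the leftover elements
        dst[i] = a[i] * s + b[i];
}
```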
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
AMD's statement:

"It makes sense that game developers would focus on AMD hardware with AMD hardware being the backbone of the next console generation. At this time, though, our relationship with DICE and EA is exclusively focused on Battlefield 4 and its performance optimizations for AMD CPUs, GPUs and APUs," AMD representatives said. "Additionally, the AMD Gaming Evolved program undertakes no efforts to prevent our competition from optimizing for games before their release.”

So it's not AMD saying that NV can't have the code before release, it's EA saying it.

Amidst the fray of E3 reveals and gameplay demos, EA announced a new partnership with AMD that could tip the scales for the chip maker's Radeon graphics cards. Starting with the release of Battlefield 4, all current and future titles using the Frostbite 3 engine — Need for Speed Rivals, Mirror's Edge 2, etc. — will ship optimized exclusively for AMD GPUs and CPUs. While Nvidia-based systems will be supported, the company won't be able to develop and distribute updated drivers until after each game is released.

And the updates say there is nothing to worry about.
UPDATE: EA and AMD have issued a statement clarifying that while the two companies are collaborating on day-one support for Radeon products for Battlefield 4, the partnership is non-exclusive and gamers using other components will be supported.

“DICE has a partnership with AMD specifically for Battlefield 4 on PC to showcase and optimize the game for AMD hardware," an EA spokesperson said. "This does not exclude DICE from working with other partners to ensure players have a great experience across a wide set of PCs for all their titles.”
http://uk.ign.com/articles/2013/06/18/all-frostbite-3-titles-to-ship-optimized-exclusively-for-amd
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I don't see why anyone was concerned or patting each other on the backs over this...

We all knew what was coming when you make a vague statement that covers 83% of the desktop CPU market and 66% of the desktop dedicated GPU market.



EA is in it for the money.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
So it's not AMD saying that NV can't have the code before release, it's EA saying it.

Sure, they said that in public. What they are doing behind closed doors is another question. And thanks to Tomb Raider we know that they prevent developers from working with nVidia and Intel. :\
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,193
2
76
Show us one publisher who said that he blocked AMD from having access to pre-release code.

If you are referring to Tomb Raider, Nvidia had access to the code before release. Crystal Dynamics just updated it right before launch and Nvidia didn't get drivers out. It was most likely to prevent TressFX leaks.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
And the release code had "substantially decreased stability, image quality and performance over a build we were previously provided."

So even with access, in the end the developers changed so much that it was effectively a "lock-out": nVidia couldn't optimise their driver for Tomb Raider.
So AMD lied to IGN with their statement. I guess we will see the same with Thief 4 and EA games in the future. The fun part is that nVidia never did this to them.
 

Meekers

Member
Aug 4, 2012
156
1
76
The fun part is that nVidia never did this to them.

Do you actually believe that, or are you just trolling again? It took me about 15 seconds to find this quote from AMD about Guild Wars 2:

GW2 is an NVIDIA TWIMTBP title. As such, AMD does not have access to the final gold build, so we will be unable to investigate CrossFire scaling until after the game launches this weekend.

Source: http://www.overclock.net/t/1297192/guild-wars-2-crossfire-7970#post_17992147

You effectively made your point here with only the second sentence. Please don't agitate unnecessarily.
--stahlhart
 