[Recanted] All Frostbite 3 Titles to Ship Optimized Exclusively for AMD


SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
How so? OpenCL is an open, publicly available API. AMD is in no way locking nVidia out of using the exact same calls. It is not AMD's fault that nVidia decided to gimp their GPUs by removing the majority of their compute power.

And chances are the compute stuff will be used for optional graphics, much like PhysX was. Although it's possible some game-engine work will be done on the GPU (as a result of consoles doing it), which could hurt nVidia. But that would be nVidia's fault for not having more compute power.

This is exactly correct. AMD will ensure that devs write for their architecture's strengths and won't be distracted by Nvidia's meddling in the meantime. Hopefully we'll see more of Dirt Showdown's lighting effects and compute without Nvidia "convincing" them otherwise.
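As a concrete illustration of the "open, publicly available API" point above, here is a minimal sketch of vendor-neutral OpenCL host code (a generic example, not from any Frostbite title): the identical calls enumerate whatever conformant platforms are installed, AMD and nVidia alike.

```cpp
// Minimal sketch: the same vendor-neutral OpenCL calls run on any
// conformant implementation (AMD, nVidia, Intel). Nothing below is
// AMD-specific; this is the sense in which the API is "open".
#include <CL/cl.h>
#include <cstdio>

int main() {
    cl_platform_id platforms[8];
    cl_uint count = 0;
    // Standard entry point, identical on every vendor's driver.
    if (clGetPlatformIDs(8, platforms, &count) != CL_SUCCESS || count == 0) {
        std::printf("No OpenCL platforms found.\n");
        return 1;
    }
    for (cl_uint i = 0; i < count; ++i) {
        char name[256] = {};
        clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME,
                          sizeof(name), name, nullptr);
        std::printf("Platform %u: %s\n", i, name); // lists every vendor present
    }
    return 0;
}
```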
 

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
This is exactly correct. AMD will ensure that devs write for their architecture's strengths and won't be distracted by Nvidia's meddling in the meantime. Hopefully we'll see more of Dirt Showdown's lighting effects and compute without Nvidia "convincing" them otherwise.

I think this will almost be necessary given that the PS4 is supposed to have 4 CUs dedicated just to compute, and the XB1 and PS4 are both GCN. It would be silly NOT to use compute, given that it could be the status quo on console and easily brought over to PC.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
This is exactly correct. AMD will ensure that devs write for their architecture's strengths and won't be distracted by Nvidia's meddling in the meantime. Hopefully we'll see more of Dirt Showdown's lighting effects and compute without Nvidia "convincing" them otherwise.

Yes, because AMD is going to hang over every studio's shoulder and tell them how to design their game, what features to use, and how often to use them. :thumbsup:

It's called Marketing Hype for a reason.

The strengths of GCN today might be the weaknesses of GCN 2.0 next year. The only constant is that GPUs are essentially the same thing from vendor to vendor; some do one thing better than the other, but outside proprietary elements everyone can do the same thing as everyone else. They're still working within DX11.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
It should also be noted that putting more emphasis on compute is part of AMD's long-term strategy, as it helps mitigate a lack of bandwidth. This should help lower-spec cards and especially APUs, which are bandwidth-starved. In the long run this will increase performance and performance per watt quite drastically.
 

Siberian

Senior member
Jul 10, 2012
258
0
0
A desperate move by AMD. Instead of bribing developers to limit performance on Intel and NVIDIA parts, they should spend that money improving their own hardware and drivers. I'm not surprised that EA is involved; they have a habit of screwing their customers.

All this in order to boost benchmarks at launch. In the long run this does nothing for their customers :thumbsdown:
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
Yes, because AMD is going to hang over every studio's shoulder and tell them how to design their game, what features to use, and how often to use them. :thumbsup:

AMD has had people available to the top 100 gaming studios since Gaming Evolved began. This is what they do - how else do you think TressFX in Tomb Raider or global illumination in Showdown happened?

The strengths of GCN today might be the weaknesses of GCN 2.0 next year. The only constant is that GPUs are essentially the same thing from vendor to vendor; some do one thing better than the other, but outside proprietary elements everyone can do the same thing as everyone else. They're still working within DX11.
Yes, and Kepler's weakness is in compute, which just happens to be the direction AMD is heading in as fast as they can go.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
AMD has had people available to the top 100 gaming studios since Gaming Evolved began. This is what they do - how else do you think TressFX in Tomb Raider or global illumination in Showdown happened?

Yes, and Kepler's weakness is in compute, which just happens to be the direction AMD is heading in as fast as they can go.

Minor effects in an ocean of choices the devs themselves made.

Who told you that? Kepler runs fine with TressFX. Blanket statements need to cover everything; otherwise they're just false statements.


Which could easily come back to bite them next gen, or the one after that, or the one after that, or however many cycles PCs go through between now and when the next-gen consoles EOL.
 

ICDP

Senior member
Nov 15, 2012
707
0
0
So far we don't have any idea what enhancements they are planning, or whether they will lock features out of Nvidia cards. If AMD does decide to go this route then it will be a despicable act IMHO. Nvidia has already gone this route in the past with PhysX and CUDA effects. AMD doing the same thing will simply polarise an already niche hobby even further.
 

Xarick

Golden Member
May 17, 2006
1,199
1
76
As long as there are no graphical issues, crashes, or IQ sacrifices, I don't see how this is bad.
The biggest issue isn't the GPU; it's the CPU. If games start crashing on Intel CPUs due to optimized code, then EA is in for a world of hurt.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
Amazing to see how the two sets of zealots are reacting to this announcement.

So far we don't have any idea what enhancements they are planning, or whether they will lock features out of Nvidia cards. If AMD does decide to go this route then it will be a despicable act IMHO. Nvidia has already gone this route in the past with PhysX; AMD doing the same thing will simply polarise an already niche hobby even further.

AMD is committed to open standards and has nothing they can "lock out" Nvidia with, but the Nvidia zealots would have you believe that is what is going on. The reality is that AMD is ensuring that Nvidia isn't allowed to interfere with making the experience the best for owners of AMD cards.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
As long as there are no graphical issues, crashes, or IQ sacrifices, I don't see how this is bad.
The biggest issue isn't the GPU; it's the CPU. If games start crashing on Intel CPUs due to optimized code, then EA is in for a world of hurt.

There is no reason that games will crash on Intel CPUs. For most purposes they act the same way AMD CPUs do.

Optimizing for AMD CPUs simply means ensuring proper multi-threading for 8 cores, which matches the upcoming next-gen consoles. This is a long-term strategy for AMD, and after many years of hurt they are forcing the payoff they deserve.
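For a sense of what "proper multi-threading for 8 cores" looks like in practice, here is a minimal, generic C++ sketch (not DICE's actual job system; `parallel_for` is an illustrative helper): the work is split across however many hardware threads the CPU reports, so an FX's eight cores get saturated without penalizing a quad-core Intel chip.

```cpp
// Minimal sketch of core-count-aware work splitting (illustrative only,
// not DICE's job system): ask the OS how many hardware threads exist and
// partition the work accordingly, so an 8-core FX is fully used.
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

void parallel_for(std::size_t count, const std::function<void(std::size_t)>& work) {
    // 8 on an FX-8350 or a Hyper-Threaded i7, 4 on a typical i5.
    const unsigned threads = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < threads; ++t) {
        pool.emplace_back([&work, count, threads, t] {
            // Each thread strides through its own share of the items.
            for (std::size_t i = t; i < count; i += threads)
                work(i);
        });
    }
    for (auto& th : pool) th.join();
}

int main() {
    std::vector<int> updated(10000, 0);
    // e.g. update 10,000 entities across all available cores.
    parallel_for(updated.size(), [&](std::size_t i) { updated[i] = 1; });
}
```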
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
As long as there are no graphical issues, crashes, or IQ sacrifices, I don't see how this is bad.
The biggest issue isn't the GPU; it's the CPU. If games start crashing on Intel CPUs due to optimized code, then EA is in for a world of hurt.

I doubt this will be the case. CPU optimizations will be for AMD CPUs only - basically better coding for their module-based chips, i.e. heavily threaded and light on FPU calculations on the CPU side. Offload the FPU stuff to the GPU.
 

Xarick

Golden Member
May 17, 2006
1,199
1
76
My concern is that they optimize the communication between AMD GPU and CPU, so if you have an AMD GPU and an Intel CPU you end up with issues.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
My concern is that they optimize the communication between AMD GPU and CPU, so if you have an AMD GPU and an Intel CPU you end up with issues.

Well, unless AMD has some sort of proprietary intermediate language, that won't be an issue. The communication between the CPU and GPU is not handled by the OS; that's handled by the motherboard.
 
Feb 19, 2009
10,457
10
76
It was only a matter of time with all the chum in the water.


I have no problem with it. First and foremost, I'm not running nVidia right now anyway. Secondly, "optimized" is just marketing BS. Third, I have little concern over the quality of nVidia's driver team. And lastly, it strikes me as highly unlikely that nVidia won't see the code; you'd have to be a moron to lock out 65% of the PC market's dedicated gamers and make your game look terrible at release.

They'll see the code, except features will be exclusive to GCN; if you turn them on, they tank performance for NV hard (as has happened in a few titles so far). It's stupid, to be honest - game developers need to make the game using non-proprietary code so it runs at 100% on every card. There's a reason we have DX11.
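On the "there's a reason we have DX11" point: the vendor-neutral way for a PC game to decide whether to enable compute-heavy effects is to query capabilities through the API itself rather than sniff for GCN or Kepler. A rough sketch under that assumption (Windows/D3D11, error handling trimmed):

```cpp
// Rough sketch: ask D3D11 itself what the installed GPU can do, instead
// of special-casing AMD or nVidia hardware. Error handling trimmed.
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

bool SupportsComputeShaders() {
    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL level = D3D_FEATURE_LEVEL_9_1;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, &level, nullptr)))
        return false;

    // Feature level 11_0 guarantees full compute shader (CS 5.0) support.
    bool ok = (level >= D3D_FEATURE_LEVEL_11_0);
    if (!ok) {
        // DX10-class cards may still expose limited compute (CS 4.x).
        D3D11_FEATURE_DATA_D3D10_X_HARDWARE_OPTIONS opts = {};
        if (SUCCEEDED(device->CheckFeatureSupport(
                D3D11_FEATURE_D3D10_X_HARDWARE_OPTIONS, &opts, sizeof(opts))))
            ok = opts.ComputeShaders_Plus_RawAndStructuredBuffers_Via_Shader_4_x != FALSE;
    }
    device->Release();
    return ok;
}
```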
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I don't mind differentiation as long as it promotes improvements in immersion, gaming-experience potential, PC awareness, and graphical fidelity, based on each IHV's commitment and vision for its customers.

Fracturing, to me, would be fundamental or foundational changes to game-play. These graphical-fidelity enhancements by AMD or nVidia are not fracturing; they offer differentiation for their customer bases, which is why developers don't mind improving fidelity over-all for their games.

I really believe there is way too much fear-mongering about division and fracturing when all there really is are more choices for gamers in fidelity or optimizations. Both AMD and nVidia working hard promotes PC gaming as a whole.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
They'll see the code, except features will be exclusive to GCN; if you turn them on, they tank performance for NV hard (as has happened in a few titles so far). It's stupid, to be honest - game developers need to make the game using non-proprietary code so it runs at 100% on every card. There's a reason we have DX11.

I would put the blame on NV, honestly. If they would stop crippling compute on their cards and in their drivers, they would have fewer problems.

Tell me, did you rail against NV when they pushed for more tessellation performance in games, back when AMD was weak in tessellation? Or did you blame AMD for being weak in tessellation?

So when AMD is strong in compute, you are blaming AMD for pushing for more compute in games? Rather than blaming NV for crippling their compute in drivers, just so an NV customer has to buy a Titan to be un-throttled in compute?
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I would put the blame on NV, honestly. If they would stop crippling compute on their cards and in their drivers, they would have fewer problems.

Tell me, did you rail against NV when they pushed for more tessellation performance in games, back when AMD was weak in tessellation? Or did you blame AMD for being weak in tessellation?

So when AMD is strong in compute, you are blaming AMD for pushing for more compute in games? Rather than blaming NV for crippling their compute in drivers, just so an NV customer has to buy a Titan to be un-throttled in compute?

Except Kepler isn't weak in compute; it's all single precision, which it is not weak at.

It is weak in some AMD-advised implementations due to how the code is run, kind of like how AMD cards are just better at bitmining than nVidia cards because of how the code/program works.

How this lie keeps getting perpetuated based on AMD-designed implementations is beyond me, or how you can draw parallels between AMD not having the tessellator power and how code is written in an open-source language.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Except Kepler isn't weak in compute; it's all single precision, which it is not weak at.

It is weak in some AMD-advised implementations due to how the code is run, kind of like how AMD cards are just better at bitmining than nVidia cards because of how the code/program works.

How this lie keeps getting perpetuated based on AMD-designed implementations is beyond me, or how you can draw parallels between AMD not having the tessellator power and how code is written in an open-source language.

Kepler (minus GK110) *IS* weak at compute. A GTX 680 is much slower than even a GTX 580 in many compute benchmarks.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
I doubt this will be the case. CPU optimizations will be for AMD CPUs only - basically better coding for their module-based chips, i.e. heavily threaded and light on FPU calculations on the CPU side. Offload the FPU stuff to the GPU.

and yet the 8 Jaguar cores in the consoles are not like the modules in the BD/PD CPUs on desktop

no, I'd wager the only "optimization" we'll see on the CPU side is simply better recognition and support for more processing threads, plain and simple. Thus the 8-thread i7s will benefit just as much, and maybe the 4c/4t i5s will start lagging behind Bulldozer/Piledriver.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
and yet the 8 Jaguar cores in the consoles are not like the modules in the BD/PD CPUs on desktop

no, I'd wager the only "optimization" we'll see on the CPU side is simply better recognition and support for more processing threads, plain and simple. Thus the 8-thread i7s will benefit just as much, and maybe the 4c/4t i5s will start lagging behind Bulldozer/Piledriver.

This is exactly what I expect for Battlefield 4. BF3 already takes advantage of threads/cores, particularly in multiplayer. They'll leverage that even further with Battlefield 4, which will be awesome: heavily multi-threaded, taking advantage of one of the few strengths of AMD's CPUs. Intel i7s will still do better, and it will be a good thing all round to finally see heavy use of multi-threading.

Battlefield 4 has a much more robust and extensive physics system, which could most likely take advantage of more CPU power to run it all. DICE is also on record as stating that Battlefield 4 will require 64-bit Windows 7 or 8, so it does look like they are taking advantage of everything they can this time out by making the client 64-bit only.

With the new consoles coming in with their 8-core chips, now is the time to push game developers to make use of all the untapped power CPUs have - something AMD sorely needs, considering their chips fall flat IPC-wise against Intel's. Some heavily threaded games already show better performance on FX CPUs versus Intel i5s, so it makes sense for AMD to push for multi-threading. I expect to see much more thorough use of it in BF4 than we did in BF3.

The Battlefield 4 alpha is already up and running; there may be some surprises in how it's making use of your CPU ;)
 

Spjut

Senior member
Apr 9, 2011
933
163
106
I hope that, for example, AVX will be supported in PC games now, since both the PS4 and Xbox One have CPUs supporting it.

I'm definitely eager to see CPU performance in BF4 multiplayer; BF3 could be really punishing on Phenom II X4 and Core 2 Quad CPUs.
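If AVX does get used on PC, the usual pattern would be runtime dispatch: detect the instruction set once, then select a code path, so a Phenom II or Core 2 gets a fallback instead of an illegal-instruction crash. A minimal sketch using GCC/Clang builtins (the `physics_step_*` functions are hypothetical stand-ins; MSVC would use `__cpuid` instead):

```cpp
// Minimal sketch of runtime AVX dispatch (GCC/Clang builtins): ship both
// code paths and pick once at startup, so pre-AVX CPUs such as a
// Phenom II X4 or Core 2 Quad fall back instead of crashing.
#include <cstdio>

// Hypothetical stand-ins for a real engine's optimized/portable paths.
void physics_step_avx()    { std::puts("AVX path"); }
void physics_step_scalar() { std::puts("scalar fallback"); }

void (*physics_step)() = nullptr;

void init_dispatch() {
    // Checked once; the function pointer costs nothing per call thereafter.
    physics_step = __builtin_cpu_supports("avx") ? physics_step_avx
                                                 : physics_step_scalar;
}

int main() {
    init_dispatch();
    physics_step(); // runs whichever path this CPU supports
}
```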
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
This is exactly what I expect for Battlefield 4. BF3 already takes advantage of threads/cores, particularly in multiplayer. They'll leverage that even further with Battlefield 4, which will be awesome: heavily multi-threaded, taking advantage of one of the few strengths of AMD's CPUs. Intel i7s will still do better, and it will be a good thing all round to finally see heavy use of multi-threading.

Battlefield 4 has a much more robust and extensive physics system, which could most likely take advantage of more CPU power to run it all. DICE is also on record as stating that Battlefield 4 will require 64-bit Windows 7 or 8, so it does look like they are taking advantage of everything they can this time out by making the client 64-bit only.

With the new consoles coming in with their 8-core chips, now is the time to push game developers to make use of all the untapped power CPUs have - something AMD sorely needs, considering their chips fall flat IPC-wise against Intel's. Some heavily threaded games already show better performance on FX CPUs versus Intel i5s, so it makes sense for AMD to push for multi-threading. I expect to see much more thorough use of it in BF4 than we did in BF3.

The Battlefield 4 alpha is already up and running; there may be some surprises in how it's making use of your CPU ;)

If there is anything that makes me need to upgrade my CPU, it will be the push for more threading. BF3 would use up to 6 threads. My current chip (in sig) will hit 95%+ CPU usage in 64-player games (even 4.5GHz i5s will hit 90% CPU). So it will most likely be time to move up to an FX, assuming my mobo supports Piledriver.
 

fixbsod

Senior member
Jan 25, 2012
415
0
0
A big bowl of yawn is all I see. All you people looking for conspiracies just want to have another big AMD vs Nvidia thread. Enough.
 
Status
Not open for further replies.