Does GameWorks influence AMD cards' game performance?

Page 10 - AnandTech Forums

GameWorks: does it penalize AMD cards?

  • Yes, it definitely does

  • No, it's not a factor

  • AMD's fault due to poor Dev relations

  • Gaming Evolved does just the same


Status
Not open for further replies.

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
This has to be one of the worst examples of the GameWorks + crap developer combo so far.
It's only going to get worse. AMD has no choice but to fight fire with fire: they need to leverage GCN across platforms and make sure as many games as possible run horribly on Nvidia hardware.

Very sad it has come to this, but I and quite a few others saw this coming years ago.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
It's only going to get worse. AMD has no choice but to fight fire with fire: they need to leverage GCN across platforms and make sure as many games as possible run horribly on Nvidia hardware.

Very sad it has come to this, but I and quite a few others saw this coming years ago.

We need people not to buy into this type of practice. That's the only way it will stop. AMD is already all-in on open source; they would have difficulty going back now. Look at the flak people gave them because they were developing Mantle in a closed environment. They don't mind, though, when nVidia develops everything in a closed environment and doesn't even open it up after they are done.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
AMD is already all-in on open source; they would have difficulty going back now. Look at the flak people gave them because they were developing Mantle in a closed environment.
People that did that are hypocrites; basically everything Nvidia does is closed off unless they have no other choice.
 
Feb 19, 2009
10,457
10
76
We need people not to buy into this type of practice. That's the only way it will stop. AMD is already all-in on open source; they would have difficulty going back now. Look at the flak people gave them because they were developing Mantle in a closed environment. They don't mind, though, when nVidia develops everything in a closed environment and doesn't even open it up after they are done.

If they don't fight fire with fire, they are gonna be burnt to the ground.

I can give you a prediction all the way to Pascal vs. Arctic Islands: AMD makes a competitive GPU vs. NV. It performs horribly in GameWorks titles. Gamers blame AMD for poor drivers and don't buy their stuff.

Just go and read the Steam forums, NeoGAF, [H], etc. on Project CARS. Many NV users there use that as justification for why they never buy AMD: because their drivers suck in so many games.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
HD4870: first GDDR5 graphics card and best performance/price GPU in its segment, forcing NVIDIA to cut the GTX 280's price by $250 within its first week of release.

HD5870: first 40nm high-end GPU and fastest single-chip GPU for more than 6 months.

HD5970: fastest graphics card for more than 18 months.

HD7970: first 28nm GPU and performance leader for 3 months.

HD7970 GE: fastest GPU for more than 6 months (until the Titan release).

R9 290X: same performance as the GTX 780 Ti at $200 less on release. Even today, 18 months after its release, it has the best performance/price in its segment and is faster than the GTX 780 Ti in many, many games.

Even financially, the GPU group has produced a profit every year from 2006 onwards.

Not sure what you're replying to, because it's completely out of context. It's about software support, if you want a hint.

Remember stuff like this?
[Image: AMD_game_physics_strategy_2010.jpg]
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
They said this also


So it's the drivers' fault PhysX is taking up so much CPU? OK. :rolleyes:

Man, that really is annoying. It seems AMD drivers are just easy to blame, since the idea has been hammered into people. They'd blame global warming on AMD drivers if they needed to.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
I don't know; it would be sneaky and disingenuous to have GPU PhysX calls and force AMD to use only the CPU, without an enable or disable setting. I would have trouble believing this.

They can't disable it and have the same game. They based the whole thing on PhysX.

The problem mentioned on the forums, about what could happen if GameWorks becomes a much more significant part of a game, is already reality, I guess. It's only a matter of time before Nvidia is investigated.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Maybe a third-party reviewer will take on the task of an in-depth investigation, with interviews with nVidia, the developer, and AMD.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Maybe a third-party reviewer will take on the task of an in-depth investigation, with interviews with nVidia, the developer, and AMD.

What? Ask the dev if he's sure that GPU PhysX is offloaded onto the CPU when no nVidia card is detected? Or if he just made it up? Or if he's lying?
 
Feb 19, 2009
10,457
10
76
According to the dev, the current fix is to lower settings that reduce CPU load, such as shadow maps & detail. Also, gamers on AMD say the smoke from tires during burnouts causes their fps to tank into single digits. GPU PhysX particles running on the CPU would explain that one.

I'm waiting for BrentJ to do a nice review on it where he blasts AMD again for not optimizing newly released games.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
What? Ask the dev if he's sure that GPU PhysX is offloaded onto the CPU when no nVidia card is detected? Or if he just made it up? Or if he's lying?

As lilltesaito offered, and it hits home:


lilltesaito said:
So much mismatched info out there.

An in-depth hardware investigation and some interviews with nVidia, the developer, and AMD would offer some clarity on what the parties' mind-sets are.
 

lilltesaito

Member
Aug 3, 2010
110
0
0
So far, the way I see it, the dev is 20% of the problem, AMD is another 20%, and Nvidia is 60%.
Devs are letting Nvidia walk all over them, and AMD is not pushing hard enough to make sure things are better for them before games come out. Nvidia is just using this to its full advantage. I think it would be very hard to prove that they are doing this to cripple AMD (which most likely they are).
Reading all these posts on the internet, it just looks like Nvidia is adding stuff to make the game better.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Wow, this is turning out to be much worse than I thought. If what the developer is stating is true, it would be the first real proof that a PC game was developed from the ground up to favour one AIB over everyone else, essentially throwing Intel/AMD GPU users under the bus. If the developer knew from the get-go that using PhysX as the foundation for their entire physics engine meant that all of these calculations could only be offloaded to an NV GPU, how did this developer expect gamers with Intel/AMD GPUs to play the game? It sounds like the developer was short on funds, and it was probably more cost-effective to use NV's pre-baked PhysX than to create their own complex physics engine. Almost shocking to believe a developer would go ahead with this idea knowing the consequences.

"The game runs PhysX version 3.2.4.1. It is a CPU based PhysX. Some features of it can be offloaded onto Nvidia GPUs. Naturally AMD can't do this.

In Project Cars, PhysX is the main component that the game engine is built around. There is no "On / Off" switch as it is integrated into every calculation that the game engine performs. It does 600 calculations per second to create the best feeling of control in the game. The grip of the tires is determined by the amount of tire patch on the road. So it matters if your car is leaning going into a curve as you will have less tire patch on the ground and subsequently spin out. Most of the other racers on the market have much less robust physics engines."
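The quoted description (a fixed 600 Hz physics tick, with grip determined by how much tire patch touches the road) can be sketched as a toy loop. This is purely illustrative: everything beyond the quoted 600 calculations/second figure, including the function names and the cosine grip model, is an assumption, not the game's actual code.

```python
import math

PHYSICS_HZ = 600          # the quoted "600 calculations per second"
RENDER_FPS = 60           # assumed render rate for illustration

def contact_patch_fraction(lean_angle_rad):
    """Toy model: tire patch on the road shrinks as the car leans."""
    return max(0.0, math.cos(lean_angle_rad))

def grip_force(base_grip, lean_angle_rad):
    """Grip scales with the fraction of tire patch on the ground."""
    return base_grip * contact_patch_fraction(lean_angle_rad)

def simulate(frames, lean_angle_rad, base_grip=1.0):
    """Run the fixed-rate physics loop for `frames` render frames.

    At 600 Hz physics and 60 fps rendering, each render frame
    requires 10 physics ticks on the CPU, regardless of GPU speed.
    """
    ticks_per_frame = PHYSICS_HZ // RENDER_FPS  # 10 ticks per frame
    total_ticks = 0
    grip = base_grip
    for _ in range(frames):
        for _ in range(ticks_per_frame):
            grip = grip_force(base_grip, lean_angle_rad)
            total_ticks += 1
    return total_ticks, grip

# One second of gameplay: 600 physics ticks must complete on the CPU.
ticks, grip = simulate(frames=60, lean_angle_rad=0.3)
```

The point of the sketch is that the tick count is fixed by the physics rate, not the GPU: if those ticks cannot be offloaded, a slower CPU path caps the frame rate no matter how fast the graphics card is.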


This definitely explains a big part of the AMD performance anomaly where the R9 280X and R9 290/290X hardly scaled relative to each other: the worst scaling I've ever seen in any game. Actually, in every video I've seen online, the GPU utilization for AMD cards is nowhere near 100%, which means the GPU is basically waiting for the CPU all the time.

It's also interesting to note how moving from an FX-8370 + 980 to a 4770K + 980 combo hardly improves performance, but when the same is done with an R9 290X, the performance skyrockets. It's as if the R9 290X is 100% CPU-limited.

FX8370+980 = 54.5 fps
4770K+980 = 62.9 fps (+15%)

vs.

FX8370+290X = 23 fps
4770K+290X = 37.4 fps (+63%)
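A quick back-of-the-envelope check of those numbers (the fps figures are from the benchmarks above; the helper function is just illustrative arithmetic):

```python
def cpu_swap_gain(fx8370_fps, i7_fps):
    """Percent fps gained by swapping the FX-8370 for the 4770K."""
    return (i7_fps / fx8370_fps - 1.0) * 100.0

gtx980_gain = cpu_swap_gain(54.5, 62.9)   # ~15%: the 980 is mostly GPU-bound
r9_290x_gain = cpu_swap_gain(23.0, 37.4)  # ~63%: the 290X looks CPU-bound
```

The asymmetry is the tell: if both cards were GPU-bound, a faster CPU would help them roughly equally. A 63% gain from a CPU swap alone suggests the AMD card spends most of its time waiting on the CPU.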

And it would then make sense why AMD is getting a performance boost in Windows 10 with the same driver: there is less CPU overhead, which means the CPU is able to perform more calculations for PhysX.

"compared to Windows 8.1 with the Catalyst 15.4 Windows 10 works by as much as 20 to 25 percent faster! "
http://www.computerbase.de/2015-05/project-cars-guide-grafikkarte-prozessor-vergleich/3/

There are 2 big questions I still have then:

1) If PhysX is being offloaded to the NV GPU, how in the world is a 960 OC within 6% of the 780 Ti? Do we have any proof that PhysX runs much, much faster on Maxwell vs. Kepler? This would be interesting to gauge.

2) If PhysX underlies the entire physics engine of the game and isn't an "On/Off" switch, then how did the developer manage to get the PS4 to hit 40-60 fps at 1080p? Did they remove PhysX, and does the console version differ in feel from the PC version, or what?
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
2) If PhysX underlies the entire physics engine of the game and isn't an "On/Off" switch, then how did the developer manage to get the PS4 to hit 40-60 fps at 1080p? Did they remove PhysX, and does the console version differ in feel from the PC version, or what?
My educated guess would be that the code path for the PS4 is well optimized, but on the PC side the PhysX stuff on the CPU is far from it.
 

96Firebird

Diamond Member
Nov 8, 2010
5,741
340
126
With all these crazy theories going around...

How is AMD getting such good performance in Win10 compared to Win8.1 and below?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Wouldn't it be more accurate to blame global warming on the R9 290X and FX-9590?

Off-topic reply:

Hehe, only 'fake' environmentalists use perf/watt of CPUs and GPUs as some 'fashionable' justification for saving the environment. Most people who are actually serious about the topic would laugh at that notion because the effect is basically immaterial. You would save more by air-drying your family's laundry, literally, or by selling that 5-year-old V8 Ford F-150, etc. Do you know how many muscle cars are purchased in America annually (Mustang, Challenger, Camaro)?

Anyone dead serious about actually saving the environment in developed countries can help right away, starting tomorrow, by buying LESS food (because a lot of the excess/unused food we buy gets thrown out and ends up in landfills) and by doing small things like buying groceries with paper bags or bringing their own bag instead of plastic, because plastics take 100+ years to degrade in landfills.

"According to the Natural Resources Defense Council, about 40 percent of all edible food is thrown away in the United States. Supermarkets lose $15 billion annually in unsold fruits and vegetables alone, while restaurants throw out around 10 percent of the food they purchase, contributing to one-fifth of all food that ends up in landfills.

According to the EPA, food waste has increased by 50 percent since the 1970s and is now the largest solid waste contributor to landfills. As your dinner remnants sit with 31 million tons of other Americans’ unfinished meals in landfills across the country, they produce methane — a gas with 25 times the global warming potential of carbon dioxide. It’s estimated that eliminating food waste would have the same impact on greenhouse gas emissions as taking a quarter of all cars in America off the road."

While the latter might seem a daunting and overwhelming task, eliminating food waste is something that everyone, from individuals to food service companies and restaurants, can be a part of. A recent U.K. survey found that 80 percent of customers want businesses to tackle food waste, and companies are responding by showing more interest and dedication to exploring solutions to the issue. For example, many companies such as Unilever and General Mills have incorporated waste reduction or “zero waste” goals into their long-term targets. They show wisdom in doing so considering consumer sentiment, financial and environmental impacts and emerging regulatory measures. (Boston for example announced a plan to ban commercial food waste last summer.)
"
http://www.triplepundit.com/2014/05/waste-want-reducing-food-waste-can-help-address-climate-change/

Anyone actually dead serious about being an environmentalist isn't sitting there researching which CPU/GPU is better on a perf/watt scale, but is proactively helping reduce food waste, actively recycling products, buying groceries with their own bags (or at least using paper bags whenever possible), walking to a store 10-20 minutes away instead of driving, buying solar panels for the roof, etc.


==============

Whoever is interested in saving the environment needs to take this topic A LOT more seriously than perf/watt.

"Getting food from the farm to our fork eats up 10 percent of the total U.S. energy budget, uses 50 percent of U.S.
land, and swallows 80 percent of all freshwater consumed in the United States. Yet, 40 percent of food in the
United States today goes uneaten. This not only means that Americans are throwing out the equivalent of $165
billion each year
, but also that the uneaten food ends up rotting in landfills as the single largest component of U.S.
municipal solid waste where it accounts for a large portion of U.S. methane emissions. Reducing food losses by
just 15 percent would be enough food to feed more than 25 million Americans every year at a time when one in
six Americans lack a secure supply of food to their tables."

http://www.nrdc.org/food/files/wasted-food-ip.pdf

That's why perf/watt is ingenious marketing PR to get people to upgrade to the latest tech (today it's easier to sell perf/watt, given that node shrinks are harder and performance increases on the CPU side are at a snail's pace): buyers "feel good about themselves by buying a more 'efficient' product," by association that it helps save the environment. The irony is they don't even think about food waste. Some food for thought.

/ Off-topic reply.
 