Just Cause 3
Why no SLI support in 2015?
Does CrossFire work?
Better question: Why is the 290X below the 970 in that graph when the 290X is faster?
They offer sweet features in a non-ideal multi-platform world. They may not be the sweetest or most ideal, but some of us don't let idealism be the enemy of good. GameWorks effects are like Baskin-Robbins or Dove: better than your average ice cream or chocolate, but not gelato or Godiva.
Open-source development means spending more money and resources, and putting in-house talent and brains to use, to make true next-gen games from the ground up.
Considering no GameWorks game of 2015 (or ever) looks as good as SW:BF, an open-source game.
Are there special features you can enable on AMD cards, like Hairworks?
TressFX Hair was certainly impressive from a visual perspective, but less discussed is the operational efficiency that compels appreciation on both technical and philosophical grounds. To put a fine point on that, we wanted to illustrate the actual performance impact of AMD’s TressFX Hair contrasted against NVIDIA’s Hairworks.
In the below diagram, we isolated the specific routine that renders these competing hair technologies and plotted the results. The bars indicate the portion of time required, in milliseconds, to render the hair from start to finish within one frame of a user’s total framerate. In this scenario, a lower bar is better as that demonstrates quicker time to completion through more efficient code.
In the diagram, you can see that TressFX Hair exhibits an identically low performance impact on both AMD and NVIDIA hardware at just five milliseconds. Our belief in “doing the work for everyone” with open and modifiable code allowed Tomb Raider’s developer to achieve an efficient implementation regardless of the gamer’s hardware.
In contrast, NVIDIA’s Hairworks technology is seven times slower on AMD hardware with no obvious route to achieve cross-vendor optimizations as enabled by open access to TressFX source. As the code for Hairworks cannot be downloaded, analyzed or modified, developers and enthusiasts alike must suffer through unacceptably poor performance on a significant chunk of the industry’s graphics hardware.
With TressFX Hair, the value of openly-shared game code is clear.
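For anyone wondering how a per-pass figure like "five milliseconds" is typically captured, here is a minimal sketch using OpenGL timer queries. This is an illustration under assumptions, not AMD's actual measurement code; drawHair() is a hypothetical stand-in for the isolated hair routine, and a GL 3.3+ context is assumed to be current.

```cpp
// Minimal sketch: timing a single render pass (here, a hair pass) with
// an OpenGL timer query. Assumes a GL 3.3+ context is already current.
#include <GL/glew.h>

void drawHair(); // hypothetical: issues the hair-rendering draw calls

double timeHairPassMs() {
    GLuint query = 0;
    glGenQueries(1, &query);

    glBeginQuery(GL_TIME_ELAPSED, query);
    drawHair();                         // the routine being measured
    glEndQuery(GL_TIME_ELAPSED);

    // GL_QUERY_RESULT blocks until the GPU has finished the pass.
    GLuint64 elapsedNs = 0;
    glGetQueryObjectui64v(query, GL_QUERY_RESULT, &elapsedNs);
    glDeleteQueries(1, &query);

    return elapsedNs / 1.0e6;           // nanoseconds -> milliseconds
}
```

The same query mechanism works on both vendors' drivers, which is what makes a vendor-neutral, apples-to-apples measurement like the one described above possible.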
There's a big difference, actually: Project Cars is unplayable on most AMD cards, while this is at least playable on the cards it should be playable on. This game is clearly extremely unbalanced, but not in the same way as GameWorks titles. You guys are just ignoring the actual problem with GameWorks titles.
So, AMD cards are not as good as nVidia's, and you blame the game? This doesn't make sense.
A GTX 970 with an OC (around 1250 MHz) gets 35 FPS at 4K with 20 AI opponents and dry conditions in Project Cars: http://www.computerbase.de/2015-05/...diagramm-grafikkarten-benchmarks-in-3840-2160
The same card gets the same frames in Dirt: Rally with only one car on the track (the GTX 970 is not overclocked here, so you can add ~15%): http://www.computerbase.de/2015-12/dirt-rally-benchmarks/2/#diagramm-dirt-rally-3840-2160
With 20 other cars on the track, Dirt: Rally wouldn't even be close to playable on an nVidia card.
BTW: The difference between the GTX 970 and 290X at 4K is exactly the same in both games:
Project Cars: GTX 970 33% faster
Dirt: Rally: 290X 33% faster
GameWorks features make all cards perform slowly and aren't worth the performance hit.
So the answer to my question is "current cards are not fast enough to handle gameworks"?
That's what you guys are pissed about?
See, this would be a good point if the 970 and 290X performed exactly the same at 4K (which, again, is irrelevant for these cards anyway) in a typical scenario. However, that's not the case; the 290X is normally 13% faster than the 970 at 4K:
So, more accurately:
Dirt: Rally - 290X gains 20% over norm
Project Cars - 970 gains 46% over norm
Your point only really makes sense if Project Cars is the most CPU-intensive game ever released for reasons unrelated to its forced, heavy CPU PhysX.
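(For what it's worth, the "over norm" figures above add and subtract the percentages directly. Treating the same numbers from this thread multiplicatively gives a similar picture:

$$\text{Dirt: Rally:}\ \frac{1.33}{1.13} \approx 1.18\ (\approx 18\%\ \text{over norm}) \qquad \text{Project Cars:}\ 1.33 \times 1.13 \approx 1.50\ (\approx 50\%\ \text{over norm})$$

Either way, Project Cars moves the 970 much further from the usual standings than Dirt: Rally moves the 290X.)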
And better not to start talking about the console versions, which ended up a disaster and practically destroyed the game... sontin, here:
Project Cars, 1920x1080:
R9 290X = 42.2 fps
GeForce GTX 970 = 74.2 fps
74.2 / 42.2 ≈ 1.76 (the GTX 970 is ~76% faster than the R9 290X)
Dirt: Rally, 1920x1080:
AMD R9 290X = 92.6 fps
GeForce GTX 970 = 80.4 fps
92.6 / 80.4 ≈ 1.15 (the R9 290X is ~15% faster than the GTX 970)
That is a huge difference.
This benchmark suite doesn't include any racing game. Using it to make a statement about a racing game is misleading.
If we use just Dirt: Rally and Project Cars, the summary would be: there is no performance difference between nVidia and AMD. If you include other games like Formula 1 2015 and Assetto Corsa, the average performance would still be the same, but Dirt: Rally and Project Cars each show more and more of a bias towards one company.
In the end there is no difference between Dirt: Rally and Project Cars.
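To make that averaging point concrete with the thread's own 33% figures: a two-game average (geometric mean) shows vendor parity even though each title individually favors one side by a third,

$$\sqrt{1.33 \times \frac{1}{1.33}} = 1.00,$$

which is exactly how per-game bias can hide inside an overall average.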
Project Cars eats CPUs. With ultra settings it generates a huge number of draw calls when you have other cars on the track.
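As a toy illustration of how that scales (all per-car numbers here are invented for the sketch, not measured from Project Cars):

```cpp
// Toy model: draw calls per frame as a function of cars on track.
// All constants are made-up illustrative values, not Project Cars data.
#include <cstdio>

int main() {
    const int baseCalls     = 1500; // track, environment, UI (assumed)
    const int partsPerCar   = 40;   // separately batched parts per car (assumed)
    const int passesPerPart = 3;    // e.g. shadow, reflection, main pass (assumed)

    for (int cars : {1, 10, 20}) {
        const int total = baseCalls + cars * partsPerCar * passesPerPart;
        std::printf("%2d cars -> ~%d draw calls per frame\n", cars, total);
    }
    return 0;
}
```

Every one of those calls costs CPU time in the driver before the GPU does any work, so the car count pushes the bottleneck onto the CPU long before the GPU runs out of headroom.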
You're trying to play with words. The issue is that the features are purposely poorly optimized (evidenced by the fact that they often completely destroy frame times and minimum frame rates) and, in many cases, make the main game harder to optimize. The only party that really benefits in the end is Nvidia, because people are forced to buy their flagship every year to game at 1080p with better-than-console graphics. If they wanted to, they could make the features run better across more hardware; they don't. GameWorks exists for no reason other than Nvidia's planned obsolescence, and it takes advantage of the fact that some devs are lazy and can use those features as a selling point. These are facts. I'm not going to explain it again, so if you're going to continue with the willful ignorance, just don't reply.
Are these "facts", or your inferences and opinions? Do you have documented proof of these accusations? You are certainly entitled to your opinions, but I don't see any proof that they are "facts". You can certainly give frame rates in various games, but personally, I don't have an inside contact at nVidia to tell me their motivations. Do you?
I'll have to side with techhog, since his posts are insightful and just make sense. But correct me if I'm wrong; we're all here to be informed, right? To my understanding, GameWorks is a crap shoot, for the major reason that it degrades performance with minimal increase in graphical fidelity. GameWorks, it seems, was developed mainly to help lazy developers (rather than work alongside those people to make sure it's properly optimized, they just inject it into the game's code with no care) and to hurt the competition. But my question is: why add such features when only the top two cards of this architecture (Maxwell 2) can run them at a somewhat decent frame rate, and even then performance tanks? To me, if a feature marketed as next-gen takes me from 60 to 45 FPS with no discernible difference in IQ, while also tanking the performance of cards that were also marketed as supporting said features, it needs to be dropped, plain and simple.
I tried the Dying Light demo for the first time last night, and my conclusion is that GameWorks sucks: the two features that tanked the game's performance were the ones with nvidia's name beside them. With those features turned off I went from 35-ish minimums to 60 FPS at minimum. Granted, there will be areas that cause drops below 60, but the performance gain was quite high, and the game did not look much different (or I'm just blind). This was at 1440p on a stock 290; the map size, I think, was a notch down, and all other settings were maxed for the demo.
So what's your plan?