AMD Announces their GameWorks Equivalent


lilltesaito

Member
Aug 3, 2010
110
0
0
Since others haven't implemented it, there must be a reason. Maybe the reason is pure semantics, but it's there. Just like we use Windows and OS X on our desktops instead of the (still free) Linux alternative: because it's not as simple as we sometimes think.

Well it seems like it is going to be used by someone else.

Intel Throws Its Support Behind AMD FreeSync-Style Display Technology

From http://www.forbes.com/sites/jasonevangelho/2015/08/21/intel-throws-its-support-behind-amd-freesync-style-display-technology/
 

iiiankiii

Senior member
Apr 4, 2008
759
47
91
Since others haven't implemented it, there must be a reason. Maybe the reason is pure semantics, but it's there. Just like we use Windows and OS X on our desktops instead of the (still free) Linux alternative: because it's not as simple as we sometimes think.

You're totally missing the point. How can you miss a point as simple as open vs. closed? I don't get it.
 

beginner99

Diamond Member
Jun 2, 2009
5,315
1,760
136
Also, what's the point of AOFX, aka HDAO? With HDAO enabled it looks the same as turned off but kills performance... Nvidia's performance, of course.

Yeah, it sure does look the same... if you are blind. Sorry, but I don't care whether NV or AMD looks better or whatever, but this just proves you are not being objective.

[image: mX3DEZM.jpg]
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
You're asking for the nigh impossible. Blocking GameWorks? Easier said than done when AMD is bleeding money, and the best defense is a good offense in this case, when the only practical way to compete is to also adopt dirty tactics like your competitor does...

Nigh impossible? It only takes a good few games to change the wave of "perception" in day-1 reviews. Instead of "Did you see the Fallout 4 results?" it would be "Good to see Fallout 4 runs evenly on both cards." Replace Fallout 4 with a few of the major AAA titles of this year alone. Look at the fanfare this forum gushes over SW:BF, and more recently DiRT Rally.

A good offense isn't going to do jack when NV is paying/incentivizing devs to put code into games that basically cripples AMD - regardless of that good offense. If people think this won't continue into the DX12 era, they aren't paying enough attention.

Despite your protests about GameWorks or Nvidia's integrity, it didn't stop you from switching over to the green team, so why not allow AMD to do the same to give them a fighting chance?

Are you not reading what I wrote? I've been asking for AMD to be more aggressive for years. I loved it when they pushed Gaming Evolved back in 2012-2013. I was promoting their attempts to secure good marketing. Then what happened? They fell asleep at the wheel. Besides waiting for the DX12 revolution to save them, they've basically let Nvidia walk all over them.

Why do you think I switched sides? AMD had no reservations about jacking up the price to compete with Nvidia 1:1. I waited for Fury X, and when it launched at $650, switching sides was a no-brainer.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
Nigh impossible? It only takes a good few games to change the wave of "perception" in day-1 reviews. Instead of "Did you see the Fallout 4 results?" it would be "Good to see Fallout 4 runs evenly on both cards." Replace Fallout 4 with a few of the major AAA titles of this year alone. Look at the fanfare this forum gushes over SW:BF, and more recently DiRT Rally.

A good offense isn't going to do jack when NV is paying/incentivizing devs to put code into games that basically cripples AMD - regardless of that good offense. If people think this won't continue into the DX12 era, they aren't paying enough attention.

Just how exactly would AMD prevent developers from using GameWorks when Nvidia is paying them to do so? We're left with the same problem we started with, and in the end AMD gets the short end of the stick...

Are you not reading what I wrote? I've been asking for AMD to be more aggressive for years. I loved it when they pushed Gaming Evolved back in 2012-2013. I was promoting their attempts to secure good marketing. Then what happened? They fell asleep at the wheel. Besides waiting for the DX12 revolution to save them, they've basically let Nvidia walk all over them.

Why do you think I switched sides? AMD had no reservations about jacking up the price to compete with Nvidia 1:1. I waited for Fury X, and when it launched at $650, switching sides was a no-brainer.

I did, but you have yet to grasp what I meant: AMD should push out an actual GameWorks equivalent. That means gimping Nvidia GPUs like there's no tomorrow and making it a nightmare for Nvidia to optimize...

At this point, performance is the only way for them to get consumers' attention, even if it means making games run like a slideshow on their competitor's hardware...
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Just how exactly would AMD prevent developers from using GameWorks when Nvidia is paying them to do so? We're left with the same problem we started with, and in the end AMD gets the short end of the stick...

Unfortunately, I don't know enough of AMD's inner workings to answer that. All I know is 2-3 years ago, they had no issues courting some devs. Hell, some of those relationships are still going today (Eidos/DICE).

From what I've been seeing of NV lately, if AMD doesn't put some bait on their lure, they're going to get shafted.


I did, but you have yet to grasp what I meant: AMD should push out an actual GameWorks equivalent. That means gimping Nvidia GPUs like there's no tomorrow and making it a nightmare for Nvidia to optimize...

I didn't say I was against that. I'm all for AMD being aggressive. The reason I didn't word my post that way is that people would be against it, and I understand why, too. But if AMD can't even get the resources to court devs, I doubt they can muster the code to fubar Nvidia.

At this point, performance is the only way for them to get consumers' attention, even if it means making games run like a slideshow on their competitor's hardware...

What good would that do them if it's titles that either A) bomb or B) get bad word of mouth (Thief, anyone? Or Murdered: Soul Suspect - do you even remember that one?)? Or them leaving their partners out to dry (why didn't AMD continue supporting Square on FFXIV, considering they got Eidos/CD in bed with them?).

So far it seems the most lucrative series to come from their partnership has been DICE games.
 

zlatan

Senior member
Mar 15, 2011
580
291
136
1) How is this Nvidia's fault if WB refused?
2) Tessellation isn't part of GameWorks.
3) How is this Nvidia's fault if AMD introduced tessellation with the world's first DX11 GPUs? Technically, Nvidia is using AMD's tech.

1) It was WB's fault. Signing a deal that doesn't allow optimizing the game is a stupid thing. Nvidia is just building a business around the lack of optimization. It makes sense for them.
2) Tessellation is part of GeometryWorks.
3) AMD provides the same kind of SDK. It's called silhouette tessellation, and it's nearly five times faster than GeometryWorks with the same quality.
 

zlatan

Senior member
Mar 15, 2011
580
291
136
We have to see. Everyone tends to back everything as long as it doesn't cost them anything. HSA is another example. The simple fact that Intel won't even commit to a timeline, or in any other way mention when support may come, means it's still a wait-and-see game.
Intel is working very hard to modify the display engine in Kaby Lake to support A-Sync. By the end of next year there will be nearly 100 FreeSync-ready monitors on the market, and these will probably work with Intel's implementation.
Also, Nvidia will have to do something, because at some point G-Sync will be a handicap. ASUS won't be able to build dozens of monitors every year to counter the FreeSync fleet.
 

zlatan

Senior member
Mar 15, 2011
580
291
136
Which of those images is the best?

Hard question. SSAO and HBAO clearly create too much false shadowing, probably because these effects are calculated at quarter resolution. HDAO, on the other hand, creates too little. Maybe undersampling?
It is hard to optimize these SSAO algorithms, because there might be some scenarios where the results will be bad. This is why I hate SSAO in general. Obscurance fields do a better job nowadays, and they aren't a screen-space effect.
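For readers unfamiliar with the screen-space idea being discussed: occlusion at a pixel is estimated by checking how many sampled neighbours in the depth buffer sit in front of it, which is why too few samples (undersampling) shows up as noisy or missing shadowing. A toy sketch - purely illustrative, not HDAO's or HBAO's actual algorithm:

```python
import numpy as np

# Fixed sample kernel (pixel offsets). Real implementations use rotated
# Poisson-disc patterns; a fixed kernel keeps this example deterministic.
KERNEL = [(-3, 0), (3, 0), (0, -3), (0, 3),
          (-2, -2), (2, 2), (-2, 2), (2, -2)]

def ssao_occlusion(depth, x, y, bias=0.02):
    """Toy screen-space AO estimate for one pixel: the fraction of sampled
    neighbours that sit closer to the camera than this pixel (minus a small
    bias to avoid self-shadowing). Fewer kernel samples means cheaper but
    noisier occlusion - the undersampling trade-off."""
    h, w = depth.shape
    occluded = 0
    for dx, dy in KERNEL:
        sx = min(max(x + dx, 0), w - 1)   # clamp to screen edges
        sy = min(max(y + dy, 0), h - 1)
        if depth[sy, sx] < depth[y, x] - bias:  # neighbour is in front
            occluded += 1
    return occluded / len(KERNEL)  # 0 = fully open, 1 = fully occluded

# A flat plane with a closer object on the left half: pixels near the
# depth edge pick up partial occlusion, pixels far from it pick up none.
depth = np.ones((16, 16))
depth[:, :8] = 0.5
print(ssao_occlusion(depth, 8, 8))    # near the edge -> 0.375
print(ssao_occlusion(depth, 14, 8))   # far from the edge -> 0.0
```

Scaling the kernel size up (or rendering at full instead of quarter resolution) trades exactly the performance being complained about in this thread for fewer false or missing shadows.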
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Or they are just using Nvidia's GameWorks libraries, which are unoptimized for ALL GPU hardware.

Since others haven't implemented it, there must be a reason. Maybe the reason is pure semantics, but it's there. Just like we use Windows and OS X on our desktops instead of the (still free) Linux alternative: because it's not as simple as we sometimes think.

Stop spreading misinformation and flat-out fabricated data. The main reason Intel hasn't supported FreeSync is that companies that size aren't flexible enough to adopt the latest cutting-edge standard for products that have been in the pipeline for 2-3 years.

"Now, Intel has thrown its own hat into the ring and announced that it intends to support the VESA Adaptive-Sync standard over the long term."
http://www.extremetech.com/gaming/212642-intel-will-support-freesync-standard-with-future-gpus

Fact of the matter is, nothing stops Intel or NV from incorporating FreeSync support into their products. NV will hold out, milking its loyal consumers for all it's worth, until market conditions force them to start supporting it. The only way NV won't support it is if AMD goes bankrupt or if NV maintains 80% market share indefinitely.

AMD needs to block Gameworks, otherwise Nvidia is gonna continue to get AAA titles under its belt.

Ya, what does that mean? You think AMD can just go to Ubisoft and tell them to drop NV as a GameWorks partner when NV sends them engineers and pays for co-marketing of their games? What, do you think it's as easy as picking up the phone, scheduling a meeting with Ubisoft executives, and telling them to use open source?

Nvidia GameWorks 2015

Dying Light -- horribly optimized at launch, with a single CPU core pegged at 100% and massive performance drops with draw distance despite almost no IQ difference.

Fallout 4 - what needs to be said about this one? Outdated graphics, unoptimized on the CPU, GPU, and memory-bandwidth side. Glitchy out the wazoo. God rays lighting that is unnecessarily demanding, yet the game looks worse than Crysis Warhead, a 2008 game.

MGS V - major issues with multi-GPU support, but about the only game on this list that was decent.

Witcher 3 - works great without HairWorks, meaning the main GameWorks feature in this game is a FAIL on all GPUs. Performance degradation is massive due to excessive and pointless over-tessellation <-- proven to be wasteful via screenshots and by reducing the tessellation factor on AMD cards.

Project CARS - one of the worst-optimized games of 2015. When this game launched, an overclocked GTX 960 was almost as fast as a GTX 780 Ti. UBER FAIL.

Just Cause 3 - a game aimed at consoles, with outdated graphics and GameWorks effects that hardly matter. Essentially, GameWorks did nothing to make this game look better than the console versions. The main differences, per Digital Foundry, are the inclusion of heat haze and slightly more detail. Basically a console game on the PC.

Assassin's Creed Syndicate - oh wow, the epitome of GamesDon'tWork. It looks worse than Unity too, which means they had to downgrade graphical details and the number of NPCs on screen, and that's what they call better optimized than Unity. How is SLI working in this game? :sneaky:

Rainbow Six Siege - outdated graphics, and the GameWorks features look terrible. The game was clearly made with consoles in mind. Graphics are miles worse than even Black Ops 3.

Killing Floor 2 - oh, you found one game that runs well on low-end hardware, but then again its graphics aren't anything special, so it should.

Evolve - completely broken CPU optimization, but worst of all, anyone who bought this game is supporting $60 of DLC. I wouldn't buy this game for $1, because I do not support such business practices. This post sums this game up:

"No offense to anyone, but you'd have to be one hell of an absolute idiot to spend that much money on this, or any other game."

Evolve - one of the most hyped games of 2015 that no one cares about today.

Batman: Arkham Knight - worst PC port of 2015, hands down; broken in all areas; flat-out bombs on 2GB GPUs. Maybe the worst console-to-PC port of all time.

AMD Gaming Evolved 2015

Battlefield Hardline -- amazing optimization
Star Wars: Battlefront -- best-looking overall + best-optimized FPS game of 2015
DiRT Rally -- best-optimized racing game on the PC in years

Thanks for proving that GameWorks has been a total failure in 2015. Whether GameWorks itself ruined some of these games or the developers are just inept at coding is debatable on a per-game basis, but nearly every GameWorks title of 2015 was broken and unoptimized.

You forgot Anno 2205 - another unoptimized turd where a "magic" patch improved performance 15-20% post-launch -- i.e., the developer rushed the game to market in an unoptimized state.

People in the gaming industry (aka Devs) like to share solutions to problems, new rendering techniques and other information.

Your view of software seems a little bit outdated. Even Microsoft is open sourcing more and more stuff ... D:

Haven't seen a solid response yet as to what developers who don't use open-source next-gen effects are going to do for their next games. Call NV again because they don't want to spend the money, or don't know how to make next-gen graphics with open-source standards? :D

AMD is always wrong!

Newsflash: some posters in this very thread who defend closed-source, black-box DLLs/middleware have a track record of crapping on anything AMD, missing out on thousands or tens of thousands of dollars via bitcoin mining (because the thought of owning AMD hardware that makes money is worse than paying $500-550 for mid-range NV cards every 2 years), constantly buying overpriced flagship NV cards and using the halo status of the top card to ignore how NV gets destroyed in all other pricing segments gen after gen, ignoring when older NV cards fall apart (read: Kepler), ignoring price/performance, etc. Basically, if AMD beat their favourite brand in every metric and cost $100, they'd still pay $500 to own NV. Remember, these are the same people who owned NV when it had terrible 2D and 3D IQ (all the eras leading up to Fermi), when NV had Full RGB broken over HDMI, who owned the god-awful GeForce 5, and who owned NV even when ATI smoked them with the 9800/X800/X1900 series. The same hypocrites spout perf/watt while having bought NV during the GTX 200/400/500 series. :sneaky:

...Most of them also haven't owned an ATI/AMD card in a decade, or maybe ever, so their opinion on AMD vs. NV GPUs is basically worthless, since they were never objective to start with and have no basis for comparing their experiences with different brands. I'd say that anyone who has cited perf/watt as a key metric since 2012, yet consecutively owned the GeForce 5, 6, 7, GTX 200, and Fermi, should automatically be disqualified from all objective GPU discussions/GPU recommendation threads. I've probably only met 2 PC gamers who fit this profile and remain objective, because they also bought AMD cards for secondary rigs (objective as in, for their next GPU purchase they actually consider AMD vs. NV as if it's an all-new purchase, not blindly deciding that their next card is NV, 100%, no matter what). I would bet that almost anyone who falls into this category would not buy an AMD card even if it were 100% faster and cost half the price. My guess is some people in this thread defending NV's GameWorks fall exactly into this category of PC users.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
Unfortunately, I don't know enough of AMD's inner workings to answer that. All I know is 2-3 years ago, they had no issues courting some devs. Hell, some of those relationships are still going today (Eidos/DICE).

From what I've been seeing of NV lately, if AMD doesn't put some bait on their lure, they're going to get shafted.

I didn't say I was against that. I'm all for AMD being aggressive. The reason I didn't word my post that way is that people would be against it, and I understand why, too. But if AMD can't even get the resources to court devs, I doubt they can muster the code to fubar Nvidia.

What good would that do them if it's titles that either A) bomb or B) get bad word of mouth (Thief, anyone? Or Murdered: Soul Suspect - do you even remember that one?)? Or them leaving their partners out to dry (why didn't AMD continue supporting Square on FFXIV, considering they got Eidos/CD in bed with them?).

So far it seems the most lucrative series to come from their partnership has been DICE games.

That's because they're having money issues ...

They are probably praying for console developers to use asynchronous compute shaders on PC by now ...

Who cares if a game gets bad word of mouth? Sales of AAA games on PC will tell the real story, so overall you get half a point about the games bombing ...

It should be noted that SE =/= Eidos ...

Square Enix in Japan prefers Nvidia, and AMD probably doesn't care too much about sponsoring old games when we all know that the incentive to buy new hardware is so that it performs well in newer games ...
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Square Enix in Japan prefers Nvidia, and AMD probably doesn't care too much about sponsoring old games when we all know that the incentive to buy new hardware is so that it performs well in newer games ...

Square Enix Japan is a different beast entirely. "Final Fantasy XIII was locked at a 1280x720 rendering resolution, with no graphical options at all to speak of"

Do you know how broken/unoptimized their Final Fantasy XIII and XIII-2 ports were?
http://www.polygon.com/2014/11/11/7195223/final-fantasy-13-pc-port-graphics-options-update

I think XIII-2 may still have stuttering issues. For the third installment of the series, they took nearly a year to fix the PC port. I think Square Enix Japan's first goal should be to even attempt good PC ports, never mind trying to push the envelope with next-gen graphics on the PC.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Ya, what does that mean? You think AMD can just go to Ubisoft and tell them to drop NV as a GameWorks partner when NV sends them engineers and pays for co-marketing of their games? What, do you think it's as easy as picking up the phone, scheduling a meeting with Ubisoft executives, and telling them to use open source?

TIL: Ubisoft is the only other dev.


That's because they're having money issues ...

Well aware of that. Which is why I said they should attempt to court devs before trying to spawn some kind of GameWorks equivalent.

They are probably praying for console developers to use asynchronous compute shaders on PC by now ...

Even if they do, if the AOTS situation is true, we've seen devs take it out to satisfy Nvidia.

Who cares if a game gets bad word of mouth? Sales of AAA games on PC will tell the real story, so overall you get half a point about the games bombing ...

Exactly, AAA titles! Which is what I said they should attempt to wrangle in. They got BioShock Infinite and FFXIV in 2013 (whether this place likes MMOs or not, that is a big win).

It should be noted that SE =/= Eidos ...

Square Enix in Japan prefers Nvidia, and AMD probably doesn't care too much about sponsoring old games when we all know that the incentive to buy new hardware is so that it performs well in newer games ...

FFXIV was partnered with AMD for the US launch. Now FFXIV: Heavensward has GameWorks and is an NV title.

That bold part is very true. And recently, it seems AMD has been causing people NOT to upgrade. Which is only going to bite them in the ass when the time to upgrade does come around and NV is leading the way.


Right now, I'm aware that AMD is pretty boned. What aggravates me the most is what the hell happened in 2014. It's like they stopped trying. They thought they could ride the coattails of the bitcoin-mining craze or something. When that well ran dry, the used market destroyed their new-card sales.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
Well aware of that. Which is why I said they should attempt to court devs before trying to spawn some kind of GameWorks equivalent.

Even if they do, if the AOTS situation is true, we've seen devs take it out to satisfy Nvidia.

Exactly, AAA titles! Which is what I said they should attempt to wrangle in. They got BioShock Infinite and FFXIV in 2013 (whether this place likes MMOs or not, that is a big win).

FFXIV was partnered with AMD for the US launch. Now FFXIV: Heavensward has GameWorks and is an NV title.

That bold part is very true. And recently, it seems AMD has been causing people NOT to upgrade. Which is only going to bite them in the ass when the time to upgrade does come around and NV is leading the way.

Right now, I'm aware that AMD is pretty boned. What aggravates me the most is what the hell happened in 2014. It's like they stopped trying. They thought they could ride the coattails of the bitcoin-mining craze or something. When that well ran dry, the used market destroyed their new-card sales.

Court devs for what, exactly? For sponsorship? What are the devs going to do without the tools to promote AMD? Preparations should come first ...

AMD still wins the battle in the end if they can get devs to use async compute, even if it means not being able to undermine their competitor's performance, so they could either take it or leave it, but I'm willing to bet that taking something rather than nothing is the smarter idea ...

They are attempting to get AAA titles, but it's not like they have any luck when Ubisoft, Activision, and others have already made deals with Nvidia ...

FFXIV's programmers are mainly located in Japan, and it doesn't matter where the game releases when you localize it ...

Stopped trying, when their engineers are hard at work? You could say 2015 has been a bust so far for AMD, but at least they launched Fiji, even if it was a flop for them. Maybe AMD will have better luck next year with GPUOpen, DX12, and possibly their new GPU microarchitecture? Who knows; they may be on the slower side, but I wouldn't say AMD has stopped trying ...
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Does anyone else find it EXTREMELY odd that AMD has this much support for FreeSync?
Especially given their GPU market share?

Not that I'm complaining; I'm happy SOMEONE got support for this technology (seriously, Nvidia? The number of display options you have is pitiful. Just utterly pitiful. Give us a decent number of G-Sync options -.-).

It looks like next year there will be a ridiculous number of FreeSync options, with even MORE options from import monitors. Really, I think AMD's LARGEST mistake has been not pushing FreeSync and "smooth gaming" enough.

Where AMD shines is the cost of getting smooth gaming. FreeSync + AMD is cheaper than G-Sync + Nvidia, and for many gamers a smooth gaming experience is what they want, and that's the best way to get it.
 

sandorski

No Lifer
Oct 10, 1999
70,677
6,250
126
Does anyone else find it EXTREMELY odd that AMD has this much support for FreeSync?
Especially given their GPU market share?

Not that I'm complaining; I'm happy SOMEONE got support for this technology (seriously, Nvidia? The number of display options you have is pitiful. Just utterly pitiful. Give us a decent number of G-Sync options -.-).

It looks like next year there will be a ridiculous number of FreeSync options, with even MORE options from import monitors. Really, I think AMD's LARGEST mistake has been not pushing FreeSync and "smooth gaming" enough.

Where AMD shines is the cost of getting smooth gaming. FreeSync + AMD is cheaper than G-Sync + Nvidia, and for many gamers a smooth gaming experience is what they want, and that's the best way to get it.

Why surprised? It's an easy feature to implement, without any licensing costs or royalties.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Hard question. SSAO and HBAO clearly create too much false shadowing, probably because these effects are calculated at quarter resolution. HDAO, on the other hand, creates too little. Maybe undersampling?
It is hard to optimize these SSAO algorithms, because there might be some scenarios where the results will be bad. This is why I hate SSAO in general. Obscurance fields do a better job nowadays, and they aren't a screen-space effect.

I think that between them, though, HDAO gives far better 3D depth.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
Why surprised? It's an easy feature to implement, without any licensing costs or royalties.

Which is why the MIT license on this code is a big deal to developers and companies like Intel.

No contract to sign with AMD, no royalties, no risk of AMD trying to change the deal later. If AMD tries to do anything besides keeping it free and open, then anyone is free to fork the code and work with the fork instead.

Will it succeed? Who knows. But it does have a real chance.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Which is why the MIT license on this code is a big deal to developers and companies like Intel.

No contract to sign with AMD, no royalties, no risk of AMD trying to change the deal later. If AMD tries to do anything besides keeping it free and open, then anyone is free to fork the code and work with the fork instead.

Will it succeed? Who knows. But it does have a real chance.

It already has double the options G-Sync has, with more on the way.

I think the issue isn't whether people will make the monitors; it's whether anyone will actually buy the monitors FOR FreeSync use. The problem is that people often don't buy a GPU and monitor together; they buy them separately, and people using Nvidia don't see a FreeSync monitor as a perk. So, really, I'm just not sure why anyone would make FreeSync monitors when AMD is such a tiny portion of the market.

With Intel supporting it, though, that may change things a bit, but AGAIN I'm not sure why that matters - what benefit does it provide to a weak Intel iGPU?
 

SViscusi

Golden Member
Apr 12, 2000
1,200
8
81
With Intel supporting it, though, that may change things a bit, but AGAIN I'm not sure why that matters - what benefit does it provide to a weak Intel iGPU?

I believe there are more advantages to adaptive sync than high-end gaming. IIRC there's a power benefit, which seems like a pretty big deal for Intel in the mobile market.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Intel is working very hard to modify the display engine in Kaby Lake to support A-Sync. By the end of next year there will be nearly 100 FreeSync-ready monitors on the market, and these will probably work with Intel's implementation.

Got any documentation on this? Because it contradicts all Kaby Lake info. Or is it just your personal guess?