Quantum Break: More like Quantum Broken.

Feb 19, 2009
10,457
10
76
You forgot to prove it. And it's funny how you're already willing to dismiss Star Swarm.

But I am looking forward to this, because you can raise one hell of a lawsuit if it's Intel's fault. And that must be sweet music to your ears.

Star Swarm is a synthetic benchmark; it ain't a game. There's nothing to dismiss, it's reality: it's a synthetic bench... unless you weren't aware. People buy hardware to play games.

It's not me who needs to prove it; if Intel is compatible with DX12, then they should be able to run DX12 games. But I've been googling and it's just not there.

How hard is it to find?!

Let this be a lesson for paper spec claims of feature support, m'kay?

Don't trust it until it's proven.

Just like the "Maxwell supports DX12 Async Compute" and "Maxwell uarch is different, it supports graphics + compute" lies NV told AnandTech.
 

airfathaaaaa

Senior member
Feb 12, 2016
692
12
81
Star Swarm is a synthetic benchmark; it ain't a game. There's nothing to dismiss, it's reality: it's a synthetic bench... unless you weren't aware. People buy hardware to play games.

It's not me who needs to prove it; if Intel is compatible with DX12, then they should be able to run DX12 games. But I've been googling and it's just not there.

How hard is it to find?!

Let this be a lesson for paper spec claims of feature support, m'kay?

Don't trust it until it's proven.

Just like the "Maxwell supports DX12 Async Compute" and "Maxwell uarch is different, it supports graphics + compute" lies NV told AnandTech.
Well, Intel showcased a Vulkan demo some time ago, but that's it; nothing more than that until now.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
The funniest thing is that in Beta 1, Nvidia was beating AMD on DX12 by a wide margin and showing real improvement under DX12 with heavy batches, and I do not know what happened in Beta 2 that reduced Nvidia's performance by 30% to 40%.
 
Feb 19, 2009
10,457
10
76
The funniest thing is that in Beta 1, Nvidia was beating AMD on DX12 by a wide margin and showing real improvement under DX12 with heavy batches, and I do not know what happened in Beta 2 that reduced Nvidia's performance by 30% to 40%.

Beta 1, according to Dan Baker on the Reddit/Steam forums, removed a lot of effects as they were re-working the effects system. It also didn't have Async Compute enabled.

They added effects and AC for Beta 2.

Originally, 7 months ago in the alpha, they had basic AC implemented; NV did not like it, told them to disable it, and Oxide complied.

See, not exactly trying to gimp NV.

Besides, NV keeps saying they have Async Compute, just not enabled. So are developers supposed to not use the feature, or what? Why are they saying they have it but it's disabled? o_O

Feb 2016, quite recent!

https://twitter.com/pellynv/status/702556025816125440

Fun FACT of the day: Async Compute is NOT enabled on the driver-side with public Game Ready Drivers. You need app-side + driver-side!
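
To make "app-side + driver-side" concrete, here is a minimal D3D12 sketch of the app-side half (my own illustration, not code from any shipping game): the app creates a dedicated compute queue next to its graphics queue, plus a fence to order work between the two. Whether the compute work actually overlaps with graphics is then entirely up to the driver and hardware.

Code:
// Minimal sketch of the "app-side" half of async compute in D3D12.
// Assumes `device` is a valid ID3D12Device*; error handling omitted.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void CreateAsyncComputeQueues(ID3D12Device* device,
                              ComPtr<ID3D12CommandQueue>& graphicsQueue,
                              ComPtr<ID3D12CommandQueue>& computeQueue,
                              ComPtr<ID3D12Fence>& fence)
{
    // Graphics (direct) queue: accepts draw, compute and copy work.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    // Dedicated compute queue: work submitted here *may* run concurrently
    // with graphics, but only if the driver/hardware actually schedules it
    // that way -- the API only expresses the opportunity.
    D3D12_COMMAND_QUEUE_DESC cmpDesc = {};
    cmpDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    device->CreateCommandQueue(&cmpDesc, IID_PPV_ARGS(&computeQueue));

    // Fence used to synchronize the two queues where ordering matters.
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
}

The driver-side half is outside the app's control: the exact same code runs everywhere, and it is the driver that decides whether the compute queue really executes concurrently or simply gets serialized behind the graphics queue.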

Btw, talk about Ashes here; all of this basic info has been covered many times -> http://forums.anandtech.com/showthread.php?t=2462951
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
The funniest thing is that in Beta 1, Nvidia was beating AMD on DX12 by a wide margin and showing real improvement under DX12 with heavy batches, and I do not know what happened in Beta 2 that reduced Nvidia's performance by 30% to 40%.

I do know.

The alpha had preliminary shader-driven effects, which NVIDIA had removed via a driver update.

The betas added a whole slew of shader-driven effects. Dan Baker mentioned, on overclock.net, that they were changing all of the shader effects for the final release. The lighting effects changed, the smoke trails, etc.

During the alpha, the compute-to-graphics ratio was 20:80. In the beta and final release, it had grown to 40:60. Dan Baker mentioned, at the time of the alpha release, that the final game would be more compute-heavy.

We're seeing this trend across a whole slew of DX12 and even DX11 titles.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
How would Hitman be better with DX11 only? Care to explain? Does it being DX11-only somehow remove the online DRM that players despise? Does it being DX11-only remove the episodic plan that Square Enix had in place years ago, when they canceled IO's other games, fired staff and downsized so they could focus on Hitman's episodic release to pay for the development costs?

Don't be daft, man. You have more logic than that.

Ashes is finished; it's a large-scale multiplayer RTS, and it works flawlessly. I've been playing it the past few days: no crashes, very stable performance. It's not for everybody, just as Planetary Annihilation and Grey Goo aren't for everybody. These are niche RTSes; their sales will never be epic.

What's a recent RTS that sold heaps? Even the new Homeworld failed to generate mass interest. If Ashes reaches 250K units in a few months, that's already a ton of profit for a small studio like Oxide. Being profitable is the key part here.

Beta 1, according to Dan Baker on the Reddit/Steam forums, removed a lot of effects as they were re-working the effects system. It also didn't have Async Compute enabled.

They added effects and AC for Beta 2.

Originally, 7 months ago in the alpha, they had basic AC implemented; NV did not like it, told them to disable it, and Oxide complied.

See, not exactly trying to gimp NV.

Besides, NV keeps saying they have Async Compute, just not enabled. So are developers supposed to not use the feature, or what? Why are they saying they have it but it's disabled? o_O

Feb 2016, quite recent!

https://twitter.com/pellynv/status/702556025816125440



Btw, talk about Ashes here; all of this basic info has been covered many times -> http://forums.anandtech.com/showthread.php?t=2462951

So I am right that compute and async crippled Nvidia's performance, and those developers knew that.
 
Feb 19, 2009
10,457
10
76
So I am right that compute and async crippled Nvidia's performance, and those developers knew that.

Well, no. You would be right if you could prove that Oxide did it to gimp NV deliberately, rather than just making an engine that complies with DX12 standards.

Because NV was busy telling everybody in the industry, including Microsoft, that it supports Async Compute.

I showed you proof, very recent, from NV PR: they claim Async Compute is not enabled in their drivers yet! -_-

They've been saying since last year that they would enable it but .... nothing.

I'm sure they will enable it later this year. For Pascal.

PS: For all the NV fans, ask NV themselves: https://twitter.com/pellynv/status/702556025816125440 <-- ask them when, an ETA even. Because you bought DX12 hardware that claims Async Compute support, but it hasn't happened! Instead of blaming NVIDIA for lying to you about their hardware, you are blaming AMD and developers.... WTH kind of sick logic is that?

What next? Are you guys going to blame developers for using all 4GB of the 970 in new games, causing it to stutter? https://youtu.be/Gki0dV2kcqM?t=2m12s

Are they gimping NV hardware on purpose or something by trying to use more than 3.5GB of VRAM? WTH?!
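
On the VRAM point, for what it's worth: an engine can at least query how much local video memory the OS currently budgets for it and try to stay under that. A rough sketch using DXGI 1.4 (my illustration only; the ReportVideoMemoryBudget helper is hypothetical), noting that the 970's 3.5GB/0.5GB split is not something this query exposes:

Code:
// Sketch: query how much local (on-board) video memory the OS currently
// budgets for this process, so an engine can avoid over-committing VRAM.
// Assumes `adapter3` is a valid IDXGIAdapter3* (DXGI 1.4+); error handling omitted.
#include <dxgi1_4.h>
#include <cstdio>

void ReportVideoMemoryBudget(IDXGIAdapter3* adapter3)
{
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    std::printf("VRAM budget: %llu MB, currently used: %llu MB\n",
                static_cast<unsigned long long>(info.Budget >> 20),
                static_cast<unsigned long long>(info.CurrentUsage >> 20));

    // Note: on a GTX 970 the last 0.5 GB is a slower segment, but DXGI does
    // not expose that split; the budget reports the whole pool.
}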
 
Last edited:

nvgpu

Senior member
Sep 12, 2014
629
202
81
It's blatant sabotage by Oxide, paid off by AMD, to cripple performance on Nvidia hardware. Inept developers writing bad code, hence everyone is pretty much ignoring and boycotting the game.





You just cannot make this stuff up and post it.

If you had proof of such a statement, I am guessing that you would have posted it.




esquared
Anandtech Forum Director
 
Last edited by a moderator:

airfathaaaaa

Senior member
Feb 12, 2016
692
12
81
So I am right that compute and async crippled Nvidia's performance, and those developers knew that.
The driver was exposing async capabilities, but when they tried to use them the driver was crashing, so they created a secondary path.

TL;DR version from 5 months ago
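
For anyone wondering what that "secondary path" can look like in practice, here is a hedged sketch (purely illustrative, not Oxide's actual code; FrameContext, RecordParticleSimulation and simulationPSO are made-up names): the same compute dispatch is recorded either on a dedicated compute command list that goes to a compute queue, or inline on the graphics command list, based on a flag the engine sets per driver after testing.

Code:
// Illustrative dual-path renderer sketch (hypothetical names, not Oxide code):
// the same compute work is recorded either on a separate COMPUTE-type command
// list (async path, submitted to a compute queue) or on the DIRECT-type
// graphics command list (fallback path, serialized with rendering).
#include <d3d12.h>

struct FrameContext {
    ID3D12GraphicsCommandList* graphicsCmdList; // DIRECT type, graphics queue
    ID3D12GraphicsCommandList* computeCmdList;  // COMPUTE type, compute queue
    bool useAsyncCompute;                       // set per vendor/driver after testing
};

void RecordParticleSimulation(FrameContext& ctx,
                              ID3D12RootSignature* rootSig,
                              ID3D12PipelineState* simulationPSO,
                              UINT threadGroups)
{
    // Pick the target command list: the separate compute list lets the driver
    // overlap this work with rendering; the graphics list serializes it.
    ID3D12GraphicsCommandList* cmd =
        ctx.useAsyncCompute ? ctx.computeCmdList : ctx.graphicsCmdList;

    cmd->SetComputeRootSignature(rootSig);
    cmd->SetPipelineState(simulationPSO);
    cmd->Dispatch(threadGroups, 1, 1);
}

On drivers where async compute works, the first path lets the compute work overlap rendering; where it crashes or regresses, the second path runs the exact same workload, just serialized on the graphics queue.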
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
Well, no. You would be right if you could prove that Oxide did it to gimp NV deliberately, rather than just making an engine that complies with DX12 standards.

Because NV was busy telling everybody in the industry, including Microsoft, that it supports Async Compute.

I showed you proof, very recent, from NV PR: they claim Async Compute is not enabled in their drivers yet! -_-

They've been saying since last year that they would enable it but .... nothing.

I'm sure they will enable it later this year. For Pascal.

PS: For all the NV fans, ask NV themselves: https://twitter.com/pellynv/status/702556025816125440 <-- ask them when, an ETA even. Because you bought DX12 hardware that claims Async Compute support, but it hasn't happened! Instead of blaming NVIDIA for lying to you about their hardware, you are blaming AMD and developers.... WTH kind of sick logic is that?

What next? Are you guys going to blame developers for using all 4GB of the 970 in new games, causing it to stutter? https://youtu.be/Gki0dV2kcqM?t=2m12s

Are they gimping NV hardware on purpose or something by trying to use more than 3.5GB of VRAM? WTH?!
They knew from the first alpha, where they stated Nvidia's async problem.
 
Feb 19, 2009
10,457
10
76
They knew from the first alpha, where they stated Nvidia's async problem.

About which they contacted NV, and NV told them they would fix it in their drivers... LOL.

They are still making those absurd claims, as recently as Feb 2016.

Seriously, don't you guys feel cheated at all? Not even a little bit?

Instead of wasting time here blaming AMD or Oxide, or other DX12 devs for using a standard DX12 feature... go ask NV to enable it in their drivers.

https://twitter.com/pellynv/status/702556025816125440
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
About which they contacted NV, and NV told them they would fix it in their drivers... LOL.

They are still making those absurd claims, as recently as Feb 2016.

Seriously, don't you guys feel cheated at all? Not even a little bit?

Instead of wasting time here blaming AMD or Oxide, or other DX12 devs for using a standard DX12 feature... go ask NV to enable it in their drivers.

https://twitter.com/pellynv/status/702556025816125440

Or, better yet, go ask Nvidia to make better DX12 hardware ... :)

The red team points fingers at GameWorks, but the green team points fingers at devs and AMD; it's funny how double standards work both ways ...

Don't direct the rage at devs for making a valid DX12 implementation; blame Nvidia for not featuring DX12 in ANY games so far ... D:

Why isn't anyone on the green team asking Nvidia to feature DX12 in games? ():)
 

96Firebird

Diamond Member
Nov 8, 2010
5,742
340
126
You would be right if you could prove that Oxide did it to gimp NV deliberately...

So now proof is necessary when accusing someone of crippling performance?

We'll have to keep that in mind the next time Nvidia's GameWorks is blamed. :awe:
 

airfathaaaaa

Senior member
Feb 12, 2016
692
12
81
So now proof is necessary when accusing someone of crippling performance?

We'll have to keep that in mind the next time Nvidia's GameWorks is blamed. :awe:
Since when is having two paths, specifically to avoid being crippled, the same as being crippled?
#logic 2016
 
Feb 19, 2009
10,457
10
76
So now proof is necessary when accusing someone of crippling performance?

We'll have to keep that in mind the next time Nvidia's GameWorks is blamed. :awe:

Meh, saw that coming as soon as I typed the above.

You know it's really easy.

One, Oxide gives NV full source code, even in alpha. NV can release a "Game Ready" driver for the Ashes ALPHA.

AMD sponsorship does not add anything hidden behind code obfuscation or encryption. Thus, whatever the devs do, NV has full access to it in plain sight. Nothing to hide at all.

So we already have proof that AMD doesn't gimp NV, because they don't forbid anyone from modifying their features. Welcome all to GPUOpen. TressFX3.0 -> PureHair ring a bell?

Basically everything GameWorks isn't.

Even their most recent GW "Open Source" attempt is a damn lie. GameWorks DLLs can't be decoded or modified; they can't be altered at all without NV's permission. Thus you can't optimize them. NV only allows peeks into the demos and their source code if you sign a developer agreement with NVIDIA, sign the EULA, and accept that NV can remove your rights anytime, for any reason. As far from Open Source as one could imagine.
 
Feb 19, 2009
10,457
10
76
Video from Digital Foundry; less stutter on a 144Hz monitor?

https://www.youtube.com/watch?v=zK2BUeYqLVI <-- the 390 pwning the 970 hard: 50% faster, sometimes even more, nearly twice as fast during action scenes. Dat Async Compute volumetric lighting!

 
Last edited:

desprado

Golden Member
Jul 16, 2013
1,645
0
0
Meh, saw that coming as soon as I typed the above.

You know it's really easy.

One, Oxide gives NV full source code, even in alpha. NV can release a "Game Ready" driver for the Ashes ALPHA.

AMD sponsorship does not add anything hidden behind code obfuscation or encryption. Thus, whatever the devs do, NV has full access to it in plain sight. Nothing to hide at all.

So we already have proof that AMD doesn't gimp NV, because they don't forbid anyone from modifying their features. Welcome all to GPUOpen. TressFX3.0 -> PureHair ring a bell?

Basically everything GameWorks isn't.

Even their most recent GW "Open Source" attempt is a damn lie. GameWorks DLLs can't be decoded or modified; they can't be altered at all without NV's permission. Thus you can't optimize them. NV only allows peeks into the demos and their source code if you sign a developer agreement with NVIDIA, sign the EULA, and accept that NV can remove your rights anytime, for any reason. As far from Open Source as one could imagine.

Wrong. The point is that they have used extreme compute since Beta 2, which even code cannot fix, because everyone knows Nvidia is weak in compute, just like AMD is weak in tessellation, which coding cannot fix either.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
Video from Digital Foundry; less stutter on a 144Hz monitor?

https://www.youtube.com/watch?v=zK2BUeYqLVI <-- the 390 pwning the 970 hard: 50% faster, sometimes even more, nearly twice as fast during action scenes. Dat Async Compute volumetric lighting!

Wrong

". The frame rate holds at 50-60 indoors, dropping to 40 outdoors. Some have suggested the game could have been unpacking data in the background, as I ran it just after the download finished. But these problems persisted for a good hour.I&#8217;m running a GTX 970, an i7-5820K CPU clocked at 3.30GHz, 16GB of RAM, and playing the game on ultra settings at 1080p with the latest NVIDIA drivers. And, if it makes any difference, I have the game installed on an SSD. Hardly a monster PC, but well within the recommended specs. I tried playing at higher resolutions on my 4K monitor, but the frame rate, unsurprisingly, took a massive hit."
http://www.pcgamer.com/quantum-break-port-impressions/




Low-level APIs are bound to flop on PC due to the many different hardware setups, which cause massive problems. It will take developers 10X the amount of time to produce a stable, good DX12 port compared to DX11.
 

kraatus77

Senior member
Aug 26, 2015
266
59
101
Wrong. The point is that they have used extreme compute since Beta 2, which even code cannot fix, because everyone knows Nvidia is weak in compute, just like AMD is weak in tessellation, which coding cannot fix either.
So you admit Nvidia's hardware is inferior. :D
 