
Doom to use Asynchronous Compute.

Simple, any major AAA game not using AC on consoles is doomed 😉
Just Cause 3 is a pretty enlightening case study on that. Lack of offloading physics onto the GPU leads to a wonderful <20 FPS experience when you're engaging in the core selling point of the game and the engine doesn't even seem to support frame skipping.
 
Simple, any major AAA game not using AC on consoles is doomed 😉

Just Cause 3 is a pretty enlightening case study on that. Lack of offloading physics onto the GPU leads to a wonderful <20 FPS experience when you're engaging in the core selling point of the game and the engine doesn't even seem to support frame skipping.

Exactly. These studios aren't stupid, they know they have to extract peak performance from consoles because that's the majority of their market. A good console gaming experience first, then the PC port comes second. Reality.

As time goes on, and devs push the graphics boundaries with AAA titles, they are going to have to use every technique available for the hardware.
 
Doom releases May 13th. That has already been announced.
 
Different engines; you can't exactly use the Nitrous engine as a reference point when engines vary in assets, pipeline, and other things ...

It took Snail Games six weeks to do a BARE BONES DX12 port of King of Wushu ...

They were one of the first to do it. I fully expect the time requirement to be less once devs get more expertise with it.

Snail Games... I guess they named themselves that for a reason 😉
 
Well, nearly every game is released before it's ready. Why should this one be any different? I'm sure it'll run great on Vulkan five months down the line, at least.
 
I hope developers learn something from the Hitman launch on PC.

It's kind of funny how tied up in knots Nvidia fanboys get because Nvidia can't optimize their drivers for the next generation of games, which is their own fault. Yet these were the same people who criticized AMD for day-one drivers in gimpwork titles, which AMD has almost no control over.

So when Nvidia monopolizes the market with black-box middleware, everything is fine, and AMD needs to fix its act.

When AMD propels the gaming industry into the next era, instead of Nvidia handicapping it, it's AMD's fault that Nvidia's drivers don't properly support DX12.

Ohh, the sweet irony is delicious!
 
The issue isn't that nVidia does not support it. It's that they constantly lie about supporting it in hardware, when they do not. They emulate it in software for compatibility, but their performance goes down.

Our work with Microsoft on DirectX 12 began more than four years ago with discussions about reducing resource overhead. For the past year, NVIDIA has been working closely with the DirectX team to deliver a working design and implementation of DX12 at GDC
https://blogs.nvidia.com/blog/2014/03/20/directx-12/

never gets old
 
Well this thread started off ugly.


Definitely excited about Doom. As long as I get 60 FPS @ 1440p, I don't care if the game renders AMD logos all over my screen.
 
Well this thread started off ugly.


Definitely excited about Doom. As long as I get 60 FPS @ 1440p, I don't care if the game renders AMD logos all over my screen.

Unfortunately, pretty much every VC&G thread I've looked at this morning has been full of pages of ugly. It seems there needs to be a big report button for the entire VC&G board.

And yeah, I'm excited for Doom too.
 
haah this is like grade school 🙂 we are not winning? time to run crying to the teachers. damn, just realized this is a standard recess/playground tactic.

Threadcrapping not allowed.


esquared
Anandtech Forum Director
 
haah this is like grade school 🙂 we are not winning? time to run crying to the teachers. damn, just realized this is a standard recess/playground tactic.

Pretty much. Even when you aren't breaking a rule you're breaking a rule to someone.

I just saw a picture from the GDC...

PC Gamer doesn't mean what it used to. 🙁
 
I guess that means we won't see it on PC since it's releasing in less than 2 months ...

Yeah, there is no way Doom IV, a game which has been in development for over four years, is using the Vulkan API, whose version 1.0 was literally released just a month or two ago. It should be OpenGL 4.5, with the Asynchronous Compute portions coming from OpenCL, as described above.

I'd expect AMD to actually have better performance than nVidia for this reason.
 
Reminds me of the Mantle and then later DX12/Vulkan discussions on here.

Some folks were of the view, back then, that Maxwell was fully capable of AC in hardware, despite the limitation of a single-engine design that hasn't changed since Fermi & Kepler. It went against the logic of the multiple separate engines these next-gen APIs use.

Then, when it was found to be lacking the hardware for it, some were of the view that by the time games using AC arrived, these current GPUs wouldn't matter.

Really? The R9 290X is still putting out very impressive performance at 1080p and 1440p. I'm seeing huge gains in Hitman DX12 personally. These GPUs are in fact still capable; there's no reason to obsolete them ahead of time just to sell more GPUs.

Basically, hardware support for a major performance-enhancing feature is counter-productive for GPU companies that want to sell more GPUs more often. Because these features extend the performance lifespan of GPUs, they don't create an incentive for upgrades.

So whenever anyone raises the question of future proofing, consider the uarch that was made to shine with DX12/Vulkan.
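The scheduling argument in the post above can be pictured with a toy model (illustrative numbers only, not measured GPU timings): on a single engine, a graphics pass and a compute pass run back-to-back, so frame time is their sum; with a separate compute queue, the passes can overlap and the frame is bounded by the slower of the two in the ideal case.

```python
# Toy model of async compute: serial vs. overlapped execution of a
# graphics pass and a compute pass within one frame (hypothetical times).

def serial_frame_time(graphics_ms, compute_ms):
    # Single engine: the compute pass waits for graphics to finish.
    return graphics_ms + compute_ms

def overlapped_frame_time(graphics_ms, compute_ms):
    # Separate compute queue: passes run concurrently; the frame is
    # bounded by the slower of the two (idealized, no contention).
    return max(graphics_ms, compute_ms)

g, c = 12.0, 4.0  # made-up pass durations in milliseconds
print(serial_frame_time(g, c))      # 16.0
print(overlapped_frame_time(g, c))  # 12.0
```

In practice the gain is smaller than this ideal, since the two passes contend for shared units and bandwidth, but the model shows why idle shader time during graphics work is what AC recovers.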

So the fact that Hyper-Q is a thing means nothing? I'm not sure at the time we could have reasonably expected async compute wouldn't work when both GK110 and Maxwell have technology to specifically enable async compute with CUDA.
 
So the fact that Hyper-Q is a thing means nothing? I'm not sure at the time we could have reasonably expected async compute wouldn't work when both GK110 and Maxwell have technology to specifically enable async compute with CUDA.

Because it's limited, and it's not even close to what Microsoft calls (and made) asynchronous compute.
 
So the fact that Hyper-Q is a thing means nothing? I'm not sure at the time we could have reasonably expected async compute wouldn't work when both GK110 and Maxwell have technology to specifically enable async compute with CUDA.

Apparently not in conjunction with graphics though.
 
Apparently not in conjunction with graphics though.

Hyper-Q (with CUDA) does support concurrent graphics + compute[1], but it's not compatible with DX12. Apparently it may have something to do with resource barriers and the differences between CUDA and the DX12 equivalent.

Because it's limited, and it's not even close to what Microsoft calls (and made) asynchronous compute.

I'm not sure how great the difference between the two is, but the conversation was referring to a time when DX12 had just been announced, and the only thing we had to compare it to was Mantle. I'm not saying it was handled in the best possible way, but I think we have to at least be fair that there existed a feature that theoretically could have provided async compute, back when we had no additional information about the capabilities of the hardware or the DX12 API.

[1] http://ext3h.makegames.de/DX12_Compute.html
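One way to picture the resource-barrier point raised above, as a toy model (nothing here is the actual D3D12 or CUDA API; queue names and timings are made up): a barrier between dependent passes forces a queue to wait for all earlier work, serializing passes that could otherwise overlap.

```python
# Toy two-queue timeline: a barrier makes a pass wait for *all* earlier
# work to finish, serializing work that could otherwise run concurrently.

def schedule(passes):
    """passes: list of (queue, duration_ms, needs_barrier) in submit order.
    Returns the time at which the last pass finishes."""
    queue_free = {"graphics": 0.0, "compute": 0.0}
    end = 0.0  # finish time of all work submitted so far
    for queue, duration, needs_barrier in passes:
        start = end if needs_barrier else queue_free[queue]
        finish = start + duration
        queue_free[queue] = finish
        end = max(end, finish)
    return end

independent = [("graphics", 12.0, False), ("compute", 4.0, False)]
dependent   = [("graphics", 12.0, False), ("compute", 4.0, True)]
print(schedule(independent))  # 12.0 -> the passes overlap
print(schedule(dependent))    # 16.0 -> the barrier serializes them
```

If the driver has to insert barriers (or can't express cross-queue dependencies the way the API expects), the concurrent schedule degenerates into the serial one, which is consistent with the "supported, but performance goes down" behavior described earlier in the thread.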
 