apparently even 980Ti can't handle async-compute without a crippling context switch

Status
Not open for further replies.

x3sphere

Senior member
Jul 22, 2009
722
24
81
www.exophase.com
I'm not saying it's entirely settled in my mind, but from what Oxide has said and the way all the data is going coupled with Nvidia's pointed silence, it seems far more likely to be true at this point. If not, Nvidia should probably say something sooner than later.

Or they just want to let the games do the talking. If this is true, it's a smart play. I think you're going to see other games that are not as reliant on async compute.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I didn't put all my eggs in one basket. See my rigs below.

Even if you just bought a 980Ti right now, it's not like the card is all of a sudden garbage. It's the fastest card on the market right now....

NV is going to work with devs to figure out how to make sure their card gets good performance. If you get bad performance in DX12 games... buy a 290X. They're $250. You're a 980Ti owner... just call it your "PhysX card".

You still have the best DX11 performance and there are TONS of games that still use that.... and tons of new games coming out using it.

And you can do this magical thing called BUYING a NEW GPU NEXT YEAR.
"Wah, my 980Ti isn't as good, I should have gotten a Fury X. Nvidia screwed me."
Sunk cost; get over it.
If you want that sunk cost to influence your decision next gen, so be it. It's not like your 980Ti was ever going to be held onto past Pascal. When Pascal drops, that 980Ti is dropping out of your rig too.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Or they just want to let the games do the talking. If this is true, it's a smart play. I think you're going to see other games that are not as reliant on async compute.

And there is no reason to talk about something that doesn't really exist.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Or they just want to let the games do the talking. If this is true, it's a smart play. I think you're going to see other games that are not as reliant on async compute.

Or they'll follow Nvidia's recommended configurations and performance won't suffer. Someone on reddit said Nvidia has recommended ways to handle this that don't kill performance, which they spoke about previously at a conference, but it's extra work to put in. I think it's way too soon to claim one thing or another, though.

I'm just an observer. My next couple games (Metal Gear Solid V, and Fallout 4) are not DX12 games. Beyond that I don't have anything I'm looking forward to right now.
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
lol you said the same thing in another thread.

I was just having a little fun. Seriously, I weighed the pros and cons of a GTX 980Ti vs. a Fury X when I decided to move the twin R9 290s in CF to my 4790K build. I can't speak to the performance of the Fury X since I don't own one, BUT the GTX 980Ti is a BEAST of a card. Likewise, my 290s have served me well, especially since I water cooled them.

Ladies and Gentlemen, make no mistake about it, this is a dog eat dog fight by AMD and Nvidia (and has become vicious since the last report of respective market shares).

I decided on the GTX 980Ti because it gave me the best overall performance for the buck. In most benchmarks it seems to be ahead of the Fury X.
 

Osjur

Member
Sep 21, 2013
92
19
81
If you guys would actually read the proper forums where programmers are testing this right now, you would know that, at least for now, async compute doesn't work on Nvidia hardware in DX12.

here's the thread: https://forum.beyond3d.com/threads/dx12-performance-thread.57188/page-8

I suggest starting at page 7 or 8, where MDolenc has made a single program to test out async compute.

With Nvidia hardware you get this:
[Image: pJqBBDS.png]


Now, if async compute actually worked, rendering times should not increase. In Nvidia's case, the combined time is always the sum of compute + graphics.

This is comparing the 980 Ti to the Fury X:

[Image: ac_980ti_vs_fury_x.png]


Don't take the picture as a performance indicator; it's just a "simple" program that queues both graphics and compute pipelines.
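(For readers following along: the interpretation above can be sketched in a few lines. This is a hypothetical helper, not MDolenc's actual test code, and the timings in the example are made-up numbers. If graphics and compute queues run concurrently, the combined time should land near the longer of the two passes; if the hardware serializes them behind a context switch, it lands near their sum.)

```python
def classify_async(graphics_ms: float, compute_ms: float, combined_ms: float,
                   tolerance: float = 0.15) -> str:
    """Classify whether a graphics pass and a compute pass overlapped.

    graphics_ms / compute_ms: time of each workload run alone.
    combined_ms: time when both are queued together.
    """
    serial = graphics_ms + compute_ms          # no overlap: the times add up
    overlapped = max(graphics_ms, compute_ms)  # full overlap: longer pass dominates
    if combined_ms <= overlapped * (1 + tolerance):
        return "concurrent (async compute working)"
    if combined_ms >= serial * (1 - tolerance):
        return "serialized (compute + graphics summed)"
    return "partial overlap"

# Hypothetical timings: a 10 ms graphics pass and a 6 ms compute pass.
print(classify_async(10.0, 6.0, 16.1))  # near 10 + 6  -> serialized
print(classify_async(10.0, 6.0, 10.4))  # near max(10, 6) -> concurrent
```

This is the shape of the Nvidia result being discussed: the combined time tracking the sum rather than the max.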
 

Rvenger

Elite Member, Super Moderator, Video Cards
Apr 6, 2004
6,283
5
81
You know what? I am done moderating this thread; this has already been discussed, and the title is misleading too.

Thread Closed and I am not even discussing this.


-Rvenger
 