OC3D :: DirectX 12 Performance Review Crimson 16.4.2 vs GeForce 364.96

csbin

Senior member
Feb 4, 2013
907
611
136
http://www.overclock3d.net/reviews/...son_16_4_2_-_directx_12_performance_boosted/7




4K

[4K benchmark charts]

1440P

[1440P benchmark charts]

1080P

[1080P benchmark charts]
 

Adored

Senior member
Mar 24, 2016
256
1
16
Nvidia driver crashes on Hitman now as well? This could be worth paying attention to with the same story in QB.

Would have been nice to see a 390X in there too.
 

Mercennarius

Senior member
Oct 28, 2015
466
84
91
Agreed, a 390X would have been an interesting comparison. In DX12 the 390X seems to be very close to Fury X performance.
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
Well give credit where credit is due; AMD's driver team has stepped up their game and should be proud!
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Well give credit where credit is due; AMD's driver team has stepped up their game and should be proud!

After reading that they changed the tess level for Fallout 4's "Max" settings, I've been a bit wary.

Some of these gains just seem too good to be true. I wouldn't be surprised if there are some IQ differences found.

AMD really has become more NV-like.
 

kondziowy

Senior member
Feb 19, 2016
212
188
116
If they lower x64 tess level to x16 it's a good thing - no difference in visual quality at all and should be that way since day 1. Only if they lower something else it would be really bad.
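For anyone unfamiliar with the override being debated here, a driver-level tessellation cap conceptually just clamps whatever factor the game requests. A minimal Python sketch (names are illustrative, not AMD's actual driver code):

```python
def clamp_tess_factor(app_requested: float, driver_cap: float) -> float:
    """Limit the tessellation factor that reaches the tessellator.

    D3D11/D3D12 allow hull shaders to request factors up to 64; a
    driver-level override simply clamps that request.
    """
    return min(app_requested, driver_cap)

print(clamp_tess_factor(64, 16))  # an x64 request is capped to x16 -> 16
print(clamp_tess_factor(8, 16))   # requests below the cap pass through -> 8
```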
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
If they lower x64 tess level to x16 it's a good thing - no difference in visual quality at all and should be that way since day 1. Only if they lower something else it would be really bad.

That's the problem. In some games, it is visible. Even in Fallout 4 it's visible.

Anyways, trying to pass that off as a "max" setting is my issue. It really isn't max settings anymore. And it gives a huge performance gain in the process, which just rubs me the wrong way.

No problem with a Tess slider, but if these changes are forced at the driver level, it makes me wonder what else AMD is doing to get some of these huge gains.
 

el etro

Golden Member
Jul 21, 2013
1,584
14
81
AMD should have the same "shader power/tess power" ratio as they had in Tonga, which equals Nvidia's ratio. That's the very best anti-cheat engine.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
What does Fallout 4 have to do with this thread? Unless it can be pointed out that they gained performance in these games by reducing tess level behind the scenes?
 

Adored

Senior member
Mar 24, 2016
256
1
16
After reading that they changed the tess level for Fallout 4's "Max" settings, I've been a bit wary.

Some of these gains just seem too good to be true. I wouldn't be surprised if there are some IQ differences found.

AMD really has become more NV-like.

If AMD were lowering IQ to gain FPS, you can be sure Nvidia would let the press know all about it. ;)

That would of course open up a whole can of worms and discussion on the topic of tessellation and whether or not the differences are worth the FPS loss. That's a topic Nvidia might feel they have a lot more to lose from - and a possible reason for their silence on AMD's 16x tess factor. They were really fast to jump on ATI's fp16 demotion, after all.
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
That's the problem. In some games, it is visible. Even in Fallout 4 it's visible.

Anyways, trying to pass that off as a "max" setting is my issue. It really isn't max settings anymore. And it gives a huge performance gain in the process, which just rubs me the wrong way.

No problem with a Tess slider, but if these changes are forced at the driver level, it makes me wonder what else AMD is doing to get some of these huge gains.

Do you have proof for your claims? Please provide it if you do.
 

Mercennarius

Senior member
Oct 28, 2015
466
84
91
Again... nothing too surprising, since it's fairly well established by now that GCN cards are better suited to DX12 than Nvidia's. I'm sure this will change with Nvidia's next-gen cards, but as it stands DX12/Vulkan/Mantle all suit architectures more in line with GCN. AMD has just refined their drivers very well over the past year or so to really take advantage of the strengths their current architecture does have.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
That's the problem. In some games, it is visible. Even in Fallout 4 it's visible.

Anyways, trying to pass that off as "max" setting is my issue. It really isn't max settings anymore. And it gives a huge performance gain in the process, which just rubs me wrong.

No problem with a Tess slider, but if these changes are forced at the driver level - makes me wonder what else AMD is doing to get some of these huge gains.

I call BS on being able to tell the difference in gameplay between 64x and 16x. I did a ton of testing with The Witcher 3, which used 64x by default. I manually lowered it through all available settings via the AMD control panel: 2x, 4x, 8x, 16x, and 64x. In a screenshot, you could tell the difference between 16x and 64x if you zoomed way in on certain areas. But this is not valid as it's not gameplay. However, the HUGE boost to FPS made it well worth going down to 16x.
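The size of that FPS boost makes sense from the geometry alone: for a uniformly tessellated patch, triangle output grows roughly with the square of the factor, so x64 emits on the order of 16 times the triangles of x16. A back-of-envelope model (assumes a uniform quad domain; the real D3D tessellator differs in detail, but the quadratic growth holds):

```python
def approx_triangles_per_patch(tess_factor: int) -> int:
    """Rough triangle count for one uniformly tessellated quad patch.

    An f-by-f grid of cells yields about 2 * f^2 triangles.
    """
    return 2 * tess_factor * tess_factor

for f in (8, 16, 64):
    print(f"x{f}: ~{approx_triangles_per_patch(f)} triangles")
# x64 produces 16 times the triangles of x16 (8192 vs 512)
```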
 
Silverforce

Feb 19, 2009
10,457
10
76
No testing for Quantum Break ?

In Quantum Break, during combat the 390 gets nearly double the FPS of the 970. I guess after this driver, it's doubled. Hah.

DX12 = devs manage memory, no longer a driver/IHV responsibility.

970 = 3.5GB + 0.5GB gimpness... oh boy.
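The 970 point above can be sketched numerically: under DX12 the application budgets memory itself, and if it budgets against the card's advertised 4 GB, part of the working set lands in the slower 0.5 GB segment. A toy model (illustrative only; real placement depends on the driver, OS, and the app's heap choices):

```python
FAST_VRAM_GB = 3.5   # GTX 970: full-speed segment
SLOW_VRAM_GB = 0.5   # GTX 970: slower partition

def slow_segment_use_gb(allocations_gb):
    """How much of the working set lands in the slow segment if the app
    budgets against the advertised 4 GB (toy model: fills fast VRAM first)."""
    total = sum(allocations_gb)
    return max(0.0, min(total - FAST_VRAM_GB, SLOW_VRAM_GB))

print(slow_segment_use_gb([2.0, 1.0, 0.8]))  # 3.8 GB used -> ~0.3 GB in the slow segment
print(slow_segment_use_gb([2.0, 1.0]))       # 3.0 GB used -> 0.0, everything stays fast
```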
 
Silverforce

Feb 19, 2009
10,457
10
76
I call BS on being able to tell the difference in gameplay between 64x and 16x. I did a ton of testing with The Witcher 3, which used 64x by default. I manually lowered it through all available settings via the AMD control panel: 2x, 4x, 8x, 16x, and 64x. In a screenshot, you could tell the difference between 16x and 64x if you zoomed way in on certain areas. But this is not valid as it's not gameplay. However, the HUGE boost to FPS made it well worth going down to 16x.

CDPR updated The Witcher 3 a few months after release; they changed the default from x64 to x32, and also added a slider which you can lower to x16 and x8. It's good to see devs go the extra mile to optimize.
 

Rvenger

Elite Member | Super Moderator | Video Cards
Apr 6, 2004
6,283
5
81
After reading that they changed the tess level for Fallout 4's "Max" settings, I've been a bit wary.

Some of these gains just seem too good to be true. I wouldn't be surprised if there are some IQ differences found.

AMD really has become more NV-like.


C'mon man. Please don't start stuff like this. Accusations from either camp shouldn't even be brought up. Only derails a good discussion.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
Impressive performance from AMD. The stage is set for Polaris to take the fight to Nvidia and gain back market share.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
C'mon man. Please don't start stuff like this. Accusations from either camp shouldn't even be brought up. Only derails a good discussion.

Start stuff like what? I read it first here from Silverforce. Then a few other posters repeated it, so I just assumed it to be true. The Fallout 4 change to tess.
 
Silverforce

Feb 19, 2009
10,457
10
76
Start stuff like what? I read it first here from Silverforce. Then a few other posters repeated it, so I just assumed it to be true. The Fallout 4 change to tess.

It does, but as far as gamers are concerned it does not reduce IQ, which is what you were talking about.

If there was evidence of a visual degradation, tech sites would be all over that.

If you have Witcher 3 you can test it yourself with the tessellation slider. :)