
[DigitalFoundry] Hitman PC Beta GTX 970 vs R9 390 Frame-Rate Test

Page 3

casiofx

Senior member
Mar 24, 2015
369
35
61
I'd buy an AMD card in a heartbeat if they'd add adaptive vertical sync to their drivers (not Freesync - I like my current monitor). I've tried hoary old RadeonPro and it just causes tearing on the bottom one inch of my screen. NV's had it for like five years now.
Did you try "Frame Rate Target Control"?
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,078
205
106
He just showed them
Remember how I said "specific"? He didn't show any, and the Steam hardware survey is not a reliable source either: the survey is opt-in, and on top of that he bases his estimate on registered accounts instead of the active user base ... (He should feel bad about pushing misleading figures, like claiming 6 million Steam users have GTX 970s when that's practically HALF of Steam's peak concurrent user count!)

We don't even know how many samples were used in the process, nor can we determine the amount of bias in the sampling ...
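To illustrate the opt-in point, here is a toy simulation (all numbers invented, not actual Steam figures) showing how an opt-in survey overstates the share of whichever group opts in more often:

```python
import random

random.seed(42)

# Hypothetical population: 60% vendor A, 40% vendor B (invented numbers)
population = ["A"] * 60000 + ["B"] * 40000

# Suppose vendor A owners opt in at 30% and vendor B owners at only 15%
opt_in_rate = {"A": 0.30, "B": 0.15}
sample = [gpu for gpu in population if random.random() < opt_in_rate[gpu]]

share_a = sample.count("A") / len(sample)
# True share of A is 60%, but the survey reports roughly 75%
print(f"True share of A: 60.0%, surveyed share: {share_a:.1%}")
```

The skew comes purely from the differing opt-in rates; no amount of sample size fixes it, which is why opt-in surveys need to be read with care.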
 

gamervivek

Senior member
Jan 17, 2011
482
19
81
Objectively, GCN support and performance has increased vis-à-vis its Kepler and even Maxwell counterparts. The prime example of this overall trend is the Radeon R9 290, which launched in late 2013 and retrospectively has been a far better purchase than a GTX 780. Looking at 2016 benchmark results from games such as The Division and Hitman (to name a few), a R9 290 was a better purchase than the newer GTX 970. The R9 390 is better still.

Looking forward to VR... using the SteamVR benchmarking tool, we can see this trend reflected in results posted in this forum:
http://forums.anandtech.com/showthread.php?t=2464814&page=2
Kepler does look pretty poor in that VR benchmark.

http://www.overclock.net/t/1592484/gamersnexus-amd-posts-internal-steamvr-performance-test-results/60#post_24937718

http://www.overclock.net/t/1592484/gamersnexus-amd-posts-internal-steamvr-performance-test-results/50#post_24936528
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Did you try "Frame Rate Target Control"?
It is the best. I use it with FreeSync, I have no more tearing, and my framerate is rock solid at the value I set.

I have tuned my LG 29UM67 to a 32-75 Hz FreeSync range and I play with a frame target of 50-60 fps (depending on the game), so that my framerate even in the most demanding scenes is very close to the cap. That way you don't notice any slowdowns.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
It is the best. I use it with FreeSync, I have no more tearing, and my framerate is rock solid at the value I set.

I have tuned my LG 29UM67 to a 32-75 Hz FreeSync range and I play with a frame target of 50-60 fps (depending on the game), so that my framerate even in the most demanding scenes is very close to the cap. That way you don't notice any slowdowns.
Does LFC work in that range you tuned?
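For context on that question: AMD's Low Framerate Compensation is commonly reported to require the panel's maximum refresh rate to be at least roughly 2.5x its minimum (the exact threshold is AMD's, not documented precisely here). A quick sketch of that check against the tuned range from the quoted post:

```python
def lfc_supported(min_hz: float, max_hz: float, ratio: float = 2.5) -> bool:
    """AMD enables LFC when max refresh is ~2.5x the min (commonly cited figure)."""
    return max_hz / min_hz >= ratio

# The tuned 32-75 Hz range from the post above: 75/32 ~= 2.34
print(lfc_supported(32, 75))   # below the ~2.5x threshold
```

By that rule of thumb the 32-75 Hz range would fall just short, which is presumably what the question is probing.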
 

airfathaaaaa

Senior member
Feb 12, 2016
692
12
81
I'd buy an AMD card in a heartbeat if they'd add adaptive vertical sync to their drivers (not Freesync - I like my current monitor). I've tried hoary old RadeonPro and it just causes tearing on the bottom one inch of my screen. NV's had it for like five years now.
WDDM 2.0 makes adaptive V-Sync irrelevant. You can see that the feature is there in RadeonPro but not enabled, since it doesn't really add anything nowadays.
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
Well, there's the performance and then there's the experience. With Kepler, you're likely to vomit all over the place after extensive VR use (up to 24 ms more latency than Maxwell). Maxwell is a little better in terms of motion sickness due to better support for Asynchronous Time Warp using coarse-grained preemption.

According to Nvidia, its VR-optimized Maxwell GPUs can reduce latency by as much as 24ms compared to earlier GPU generations. This is a major deal when you figure that good VR experiences are aiming for 20ms or less of latency, and current Nvidia GPUs are part of a pipeline that is 57ms total.
That's as low as 33 ms for Maxwell. VR aims for 20 ms.

GCN leaves both in the dust due to the finer-grained preemption used for Asynchronous Time Warp (yes, the ACEs). Heck, LiquidVR won a prestigious award: http://www.roadtovr.com/amds-liquid-vr-initiative-wins-ais-lumiere-award/

This is why AMD are pushing VR, their architecture is better suited for it. LiquidVR is the better VR implementation when compared to Gameworks VR.
http://www.anandtech.com/show/9043/amds-liquidvr-announced-amd-gets-expanded-vr-headset-functionality

In the context of virtual reality, when rendering images to virtual reality headsets, low latency is critical to avoid nausea and motion sickness. One of the challenges associated with reducing latency is ensuring that the GPU has access to the latest head position information when it is rendering each frame. Even if the information is fresh when rendering begins, it may already be stale by the time the frame completes.

Asynchronous Time Warp addresses this issue by obtaining fresh head tracking information after each frame finishes rendering, using it to warp the frame to appear as if it was rendered from the new viewpoint. This warping makes use of a compute shader, and needs to be executed with high priority to avoid adding latency back into the pipeline. Executing it asynchronously allows latency to be minimized and helps eliminate stuttering, since context switching and pre-emption overhead can be avoided.
http://developer.amd.com/tools-and-sdks/graphics-development/liquidvr/
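The Asynchronous Time Warp flow described above can be sketched roughly like this (a toy illustration only; all function names here are hypothetical stand-ins, and the real work happens in a high-priority async compute shader, not Python):

```python
import time

def sample_head_pose() -> float:
    """Pretend head yaw in degrees; a real HMD SDK would supply this."""
    return time.perf_counter() * 90.0 % 360.0

def render_frame(pose: float) -> dict:
    """Render the scene from the pose sampled at frame start."""
    return {"rendered_pose": pose, "pixels": "..."}

def timewarp(frame: dict, fresh_pose: float) -> dict:
    """Reproject the finished frame toward the newest pose.
    In a real driver this runs asynchronously at high priority,
    so it needn't wait for the next full render."""
    frame["corrected_by_deg"] = fresh_pose - frame["rendered_pose"]
    return frame

pose_at_start = sample_head_pose()
frame = render_frame(pose_at_start)   # takes several ms; the pose goes stale
fresh_pose = sample_head_pose()       # re-sample just before scanout
frame = timewarp(frame, fresh_pose)   # cheap warp instead of a full re-render
```

The point of the "finer-grained preemption" argument is the last step: the warp must jump the GPU's queue right before scanout, and the cheaper that preemption is, the less latency the warp adds back.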

So while FPS is important for running the games in VR, latency is far more important when it comes to reducing nausea caused by motion sickness.

AMD planned for VR way before NVIDIA. Don't take my word for it; Oculus said this: http://www.tomshardware.com/news/oculus-oculus-connect-vr-amd-nvidia,27729.html

According to Tom Forsyth, Oculus VR's Software Architect, this is a new enhancement for Nvidia GPUs that is already available in AMD graphics cards. In fact, some of these optimizations can be found in current Xbox One and Sony PlayStation 4 platforms as part of their respective SDK. Why? Because they are based on AMD GPUs.

In my opinion, the Nvidia reveal is a good news story for everyone. It's less about saying one GPU brand is better than another for VR; instead, this is about introducing a new choice and widening the landscape for GPU consumers interested in VR.
So for NVIDIA, great VR support really hinges on Pascal. Even though Kepler, if overclocked, can perform nicely, the latency associated with a Time Warp makes it relatively useless for VR.

Maxwell should be decent. A large improvement over Kepler for sure. GCN, well, it should run VR quite well.

As for Polaris, better than GCN.
 
Last edited:

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,015
91
So you're unfamiliar with the concept of sampling bias?
The hardware survey is garbage, tbh. I've yet to get it on my desktop since I got my 290 years ago, and I've gone through ~4-5 fresh installs (7, 8, 8.1, 10, with new SSDs in between as well).

Yet first time I installed it on my laptops, "Would you like to partake?"...
 

Kenmitch

Diamond Member
Oct 10, 1999
8,502
2,244
136
The hardware survey is garbage tbh. I've yet to get it on my desktop since I got my 290 years ago, and gone through ~4-5 fresh installs (7, 8, 8.1, 10 and new SSDs between as well).

Yet first time I installed it on my laptops, "Would you like to partake?"...
I haven't seen it either.

Looking at it today shows Nvidia has 54.83% and AMD has 26.21% usage.

It takes funky math to turn that into the oft-preached 80/20 (or worse) split in favor of Nvidia.

On another note, Steam's system information shows my GPUs don't even have any detected memory... Go figure.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,015
91
I haven't seen it either.

Looking at it today shows Nvidia has 54.83% and AMD has 26.21% usage.

It takes funky math to turn that into the oft-preached 80/20 (or worse) split in favor of Nvidia.

On another note, Steam's system information shows my GPUs don't even have any detected memory... Go figure.
Ah there we go, have to go to Help->System Info to manually take it. Great now my result finally counts ;)
 

Kenmitch

Diamond Member
Oct 10, 1999
8,502
2,244
136
Ah there we go, have to go to Help->System Info to manually take it. Great now my result finally counts ;)
That's how it works? Guess mine counts also.

Currently downloading the game on Steam to see how it is. I played the last one and thought it was okay.
 

Hitman928

Diamond Member
Apr 15, 2012
3,634
4,054
136
Ah there we go, have to go to Help->System Info to manually take it. Great now my result finally counts ;)
Did you actually get a survey invite that way or just your system info? I just get system info which doesn't mean it is added to the survey data.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,015
91
Did you actually get a survey invite that way or just your system info? I just get system info which doesn't mean it is added to the survey data.
Got my system info, but I'm assuming they also took a copy. Maybe not.
 

Hitman928

Diamond Member
Apr 15, 2012
3,634
4,054
136
Got my system info, but I'm assuming they also took a copy. Maybe not.
Yeah, I don't think that makes you a participant. When you get a survey they'll ask you about your internet speeds and such as well.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
That's how it works? Guess mine counts also.

Downloading the game currently on steam to see how it is. I played the last one and it was ok I thought.
I may be in the minority but I liked the last one better. It had a scoring system that motivated you to find novel ways of doing the hit. You lost points for setting off an alarm or killing someone unnecessarily. With this one you can kill whoever you want and setting off an alarm generally means you have to start from a save.

Here it all seems so spoon-fed: you can track opportunities that are handed to you. In Absolution you had to figure it out yourself; honestly, there were many I didn't find until I read a walkthrough. But when you do find one on your own, it's much more rewarding. And these games are sort of like puzzles which you play and replay, so the points system encourages you to do better. With this game (I know they're tutorials), it doesn't really make you want to do a level more than twice.

I know a lot has been said about triggers. I don't like them. The old system of just tracking and watching seemed more natural.
 

Mahigan

Senior member
Aug 22, 2015
573
0
0


Carry on. :)


Off-topic and trolling are not allowed.
Markfw900
 
Last edited by a moderator:
