PCGH Ryse: Son of Rome benchmarks


MaksFuski

Junior Member
Oct 10, 2014
2
0
0
This was recorded with 2x2 supersampling.
If you turn off supersampling you get 60+ fps.
This is just for information.
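For context on why disabling supersampling helps so much (the 1920x1080 base resolution below is only an illustrative assumption, not a figure from the benchmark): 2x2 supersampling renders the frame at twice the width and twice the height and then downscales, so it shades four times as many pixels.

\[
1920 \times 1080 \;\xrightarrow{\,2\times2\ \text{SSAA}\,}\; 3840 \times 2160,
\qquad
\frac{3840 \cdot 2160}{1920 \cdot 1080} = 4
\]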
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
Ryse is definitely the first true next gen game in terms of graphics. There is no doubt about that. :thumbsup:

http://www.eurogamer.net/articles/digitalfoundry-vs-ryse-son-of-rome

" Ryse delivers the full suite of CryEngine features with excellent image quality and it's still just a launch title. Given the experience of working on such a product, we have little doubt that Crytek could produce a Crysis Trilogy of sorts for next-generation consoles with few compromises - and yes, we want it. "

Now, when can we see more interesting, true next-gen games using the latest CRYENGINE, like Star Citizen and even the next CRYSIS? :)
 

Freddy1765

Senior member
May 3, 2011
389
1
81
Guys, running this game @ max detail (maybe because I enabled Native Upscaling?), it runs @ 60 FPS no problem with my config.

i7-4770K @ 4.2
GTX 780
16GB
SBZ

running @ 1680x1050

What in the world are you doing at that resolution with a setup like that? :eek:
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
Not to stir the pot, but compared to the Xbone version, there is not a huge difference.
Better lighting and shadows, and slightly better textures.
http://www.eurogamer.net/articles/digitalfoundry-2014-ryse-pc-face-off

I wonder how much blowback Digital Foundry will get for this quote:
"...at this higher resolution that we began to run into performance issues that brought our frame-rate down, necessitating a 30fps lock for a consistent update. It's clear now why Crytek went out of its way to note that the 4K experience is designed for 30fps when using high-end GPUs, but thanks to the beautiful post-processing and a superb motion blur implementation, it still looks excellent at the 'cinematic' frame-rate."
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
You are right. How could I expect a great-performing game as an nVidia user.
My bad. :hmm:

So I guess all this talk about Watch Dogs and GameWorks was just FUD? Okay. I will remember this.
Most games that Nvidia puts their GameWorks crap in turn out to be unoptimized garbage. And you can't always look at just framerate, as games can play like crap even if the average framerate looks OK.
 

Dankk

Diamond Member
Jul 7, 2008
5,558
25
91

iiiankiii

Senior member
Apr 4, 2008
759
47
91
The game looks good, but a single 290X at 1080p at max settings (no supersampling) would have noticeable fps dips into the 20s during the opening battle scene. The opening scene was very intense, though. I can see why they wanted to cap it at 30 fps. The fps swings are a bit wild: one moment you're at 60 fps, the next moment you're in the 20s. Those kinds of swings will distract from the gaming experience.
 

psolord

Golden Member
Sep 16, 2009
1,916
1,194
136
Some site should take a look at the CPU side.

I saw all cores of my 4GHz 2500K hit 90%, and that was just for the 45-50 fps my 7950 GHz could give.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
There's AMD biased, and then there's this. Holy crap, this game is one-sided.


Are you looking at the same benches I am? I see a 290X and GTX980 tied for the top, then the GTX780Ti and GTX970 close after, followed by the GTX780 and then the GTX770... and so on. Nothing looks way out of order. The GTX980 is generally faster than a 290X, but not so much so that some games being faster on the 290X is unreasonable.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Are you looking at the same benches I am? I see a 290X and GTX980 tied for the top, then the GTX780Ti and GTX970 close after, followed by the GTX780 and then the GTX770... and so on. Nothing looks way out of order. The GTX980 is generally faster than a 290X, but not so much so that some games being faster on the 290X is unreasonable.
Are you? The 290X is beating the 980 by 15-18%, whereas in most other games even the 970 beats the 290X. A 290X is doing about 40% better in this game than it does on average.
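For anyone wondering how a 15-18% lead turns into a "40% better than average" claim, a rough worked example helps; the assumption that the GTX 980 averages about 20% faster than the 290X across a typical game suite is purely illustrative, not a number from this thread.

\[
\underbrace{\tfrac{1}{1.20} \approx 0.83}_{\text{290X vs 980, typical}}
\;\longrightarrow\;
\underbrace{1.17}_{\text{290X vs 980, Ryse}},
\qquad
\frac{1.17}{0.83} \approx 1.40
\]

Under that assumed baseline, the 290X's standing relative to the 980 shifts by roughly 40%.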
 

HurleyBird

Platinum Member
Apr 22, 2003
2,684
1,268
136
Are you? The 290X is beating the 980 by 15-18%, whereas in most other games even the 970 beats the 290X. A 290X is doing about 40% better in this game than it does on average.

It's actually more surprising this isn't happening more often. It just means Crytek did a really great job optimizing for the current crop of consoles right out of the gate, which means heavy GCN optimizations. As time goes on, you'll see this happen more often. Future Nvidia architectures will most likely focus on patching up weak spots where GCN does a better job (Maxwell has already done this to an extent with some compute tasks), so it may or may not end up being a huge deal, but it does give the current crop of AMD cards an amount of future-proofing.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
Are you? The 290X is beating the 980 by 15-18%, whereas in most other games even the 970 beats the 290X. A 290X is doing about 40% better in this game than it does on average.

Aren't those overclocked 290X's? I dunno, maybe the GTX980 is faster than I thought, but things don't look that out of the ordinary to me.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Aren't those overclocked 290X's? I dunno, maybe the GTX980 is faster than I thought, but things don't look that out of the ordinary to me.
That is an overclocked 980 too. And yes, this is way out of the ordinary.
 

DarkKnightDude

Senior member
Mar 10, 2011
981
44
91
Maybe Nvidia has some work to do driver-side. Or the game makes heavier use of GCN's compute.

Anyway, it doesn't matter; this game, as pretty as it looks, has the most boring gameplay ever.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
So basically, from what I'm reading in this thread, there NEVER was a day when a new video card was released and was NOT better than the competitor's hardware in EVERY single benchmark by a LARGE margin?
 

Abwx

Lifer
Apr 2, 2011
10,947
3,457
136
That is an overclocked 980 too. And yes, this is way out of the ordinary.

Because it's a given that the 980 should be systematically better in all games and resolutions under all possible settings?

http://www.hardware.fr/getgraphimg.php?id=77&n=7


http://www.hardware.fr/articles/928-16/benchmark-hitman-absolution.html
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Because it's a given that the 980 should be systematically better in all games and resolutions under all possible settings?

http://www.hardware.fr/getgraphimg.php?id=77&n=7

http://www.hardware.fr/articles/928-16/benchmark-hitman-absolution.html
What part of "out of the ordinary" are you confused about?

And that bench you just posted is way off, as there is no way any of those cards are that fast. They must have used 2x MSAA, not 4x MSAA.

EDIT: They are running 1920x1080 with 4x MSAA, so someone there can't even get that straight.
 

TrulyUncouth

Senior member
Jul 16, 2013
213
0
76
I think people are purposefully ignoring what the devs themselves said, which was posted earlier. They made heavy use of GPU compute. AMD has beefier compute hardware, so they see the benefit of that. Isn't Maxwell relatively weak in compute?
 

jpiniero

Lifer
Oct 1, 2010
14,591
5,214
136
If it was compute, then the 780 Ti would be doing worse. Driver issues or bandwidth I would say.
 
Feb 19, 2009
10,457
10
76
I wouldn't worry; NV will patch it up in an updated driver like they did for DiRT when its GI lighting used DX11 compute, or for Hitman, etc. Those came out really bad on NV hardware, but a few months later they became equal or faster.

This is definitely a case of CryEngine taking full advantage of GCN in consoles and using shaders to perform compute for rendering. NV just needs to update their drivers. There's nothing inherently crap about Kepler or Maxwell in compute besides weak DP (neutered on non-professional cards), and I don't think they are using DP compute for their rendering, because it would be insane when even AMD hardware is much faster at SP. But only the devs can elaborate.
 

TrulyUncouth

Senior member
Jul 16, 2013
213
0
76
If it was compute, then the 780 Ti would be doing worse. Driver issues or bandwidth I would say.

Bandwidth may be part of it, but wasn't it the devs themselves who talked about extensively using compute on the GPU?

OK, I looked up the quote; it's from the principal rendering engineer at Crytek:
We are making heavy use of some DX11 features like Compute Shaders, which however, perform better on some hardware architectures than others, so there will be some noticeable performance gaps between different desktop GPUs.

It seems they knew the performance might be counter-intuitive compared to other games. Perhaps NV will fix it in a patch, but isn't compute on AMD considerably better? I at least remember that being the general consensus in recent generations.
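Since the whole argument turns on what "heavy use of Compute Shaders" means, here is a minimal sketch of the DX11 mechanism being referenced. This is not Crytek's code and says nothing about how CryEngine actually schedules its rendering work; it only shows a trivial compute shader being compiled, bound to a read/write buffer, and dispatched through Direct3D 11. The shader contents and buffer size are made up for illustration.

```cpp
// Minimal DX11 compute-shader sketch (illustrative only, not CryEngine code).
// Build (MSVC): cl /EHsc cs_sketch.cpp d3d11.lib d3dcompiler.lib
#include <d3d11.h>
#include <d3dcompiler.h>
#include <cstdio>
#include <cstring>

// Trivial HLSL compute shader: each thread doubles one element of a buffer.
static const char* kShader = R"(
RWStructuredBuffer<float> data : register(u0);
[numthreads(64, 1, 1)]
void main(uint3 id : SV_DispatchThreadID) { data[id.x] *= 2.0f; }
)";

int main() {
    ID3D11Device* dev = nullptr;
    ID3D11DeviceContext* ctx = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION, &dev, nullptr, &ctx))) {
        printf("no D3D11 device\n");
        return 1;
    }

    // Compile the compute shader for shader model 5.0 and create the shader object.
    ID3DBlob* blob = nullptr;
    ID3DBlob* errors = nullptr;
    if (FAILED(D3DCompile(kShader, strlen(kShader), nullptr, nullptr, nullptr,
                          "main", "cs_5_0", 0, 0, &blob, &errors))) {
        printf("shader compile failed\n");
        return 1;
    }
    ID3D11ComputeShader* cs = nullptr;
    dev->CreateComputeShader(blob->GetBufferPointer(), blob->GetBufferSize(), nullptr, &cs);

    // A structured buffer of 1024 floats with unordered (read/write) GPU access.
    // Contents are left uninitialized; this sketch only demonstrates the dispatch path.
    const UINT kCount = 1024;
    D3D11_BUFFER_DESC bd = {};
    bd.ByteWidth = kCount * sizeof(float);
    bd.Usage = D3D11_USAGE_DEFAULT;
    bd.BindFlags = D3D11_BIND_UNORDERED_ACCESS;
    bd.MiscFlags = D3D11_RESOURCE_MISC_BUFFER_STRUCTURED;
    bd.StructureByteStride = sizeof(float);
    ID3D11Buffer* buf = nullptr;
    dev->CreateBuffer(&bd, nullptr, &buf);

    D3D11_UNORDERED_ACCESS_VIEW_DESC ud = {};
    ud.Format = DXGI_FORMAT_UNKNOWN;            // required for structured buffers
    ud.ViewDimension = D3D11_UAV_DIMENSION_BUFFER;
    ud.Buffer.NumElements = kCount;
    ID3D11UnorderedAccessView* uav = nullptr;
    dev->CreateUnorderedAccessView(buf, &ud, &uav);

    // Bind the shader and UAV, then dispatch 1024 / 64 = 16 thread groups.
    ctx->CSSetShader(cs, nullptr, 0);
    ctx->CSSetUnorderedAccessViews(0, 1, &uav, nullptr);
    ctx->Dispatch(kCount / 64, 1, 1);

    printf("dispatched %u compute threads\n", kCount);
    return 0;
}
```

Work submitted through this path runs on the GPU's compute pipeline rather than the fixed graphics pipeline, and how well it performs depends heavily on each architecture's shader scheduling and throughput, which is the hardware-dependent gap the Crytek quote is pointing at.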