[PcGameshardware] The Witcher 3 Benchmark

Page 7

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
It's not the only weak point.

[Image: 3dm-color.gif]

[Image: tessmark.gif]

I haven't installed the latest driver yet, but my colour fill benchmark on my overclocked 780 beats a 290, though not a 960.


On the other hand, my TessMark score thrashes the 970 in that chart.


Well, Maxwell has some significant compute enhancements:

http://devblogs.nvidia.com/parallel...ould-know-about-new-maxwell-gpu-architecture/

Benchmarks here from Anandtech: http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/20

The only thing it doesn't excel at is double precision (unlike GK110/Titan), but those aren't the kind of compute workloads you see in games.
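To put some rough numbers on that, theoretical FLOPS scale as cores × 2 (FMA) × clock, with FP64 at an architecture-fixed fraction of FP32. A back-of-envelope sketch in Python, using the public spec-sheet core counts, boost clocks, and FP64 ratios (purely illustrative, not measured):

```python
# Theoretical throughput: cores * 2 (FMA = 2 ops) * clock in GHz = GFLOPS (FP32).
# FP64 runs at a fixed fraction of FP32 per architecture:
# GK110 Titan is 1/3 rate, GM204 (GTX 980) is 1/32 rate.
def gflops(cores, clock_ghz, fp64_ratio):
    sp = cores * 2 * clock_ghz
    return sp, sp * fp64_ratio

titan_sp, titan_dp = gflops(2688, 0.876, 1 / 3)      # GK110 Titan at boost clock
gtx980_sp, gtx980_dp = gflops(2048, 1.216, 1 / 32)   # GM204 GTX 980 at boost clock

# Similar FP32 throughput, but Titan has roughly 10x the FP64 throughput.
print(titan_sp, titan_dp)      # ~4709 GFLOPS SP, ~1570 GFLOPS DP
print(gtx980_sp, gtx980_dp)    # ~4981 GFLOPS SP, ~156 GFLOPS DP
```

Which is why the 980 keeps up in single-precision game workloads while falling far behind in double-precision compute.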

I've run a few of these tests previously for another thread, and my results paint a different picture. My LuxMark score is 42% faster than Anandtech's 980 result.

Mixed results here:

Face detection is only 60% of the 980.
Optical Flow is 11% faster than the 980.
Particle simulation is 4% faster than the 980.

It seems things have improved since those charts were made. Hopefully the next set of drivers will focus more on Kepler improvements.
 
Last edited:

gamervivek

Senior member
Jan 17, 2011
490
53
91
Has no one caught on to the fact that pcgameshardware.de always disables, or at the very least locks, GPU Boost, and that in this case it's resulting in an almost 50% clock advantage for the GM204 chips compared to the old GK110 ones?

No?

I mean, I don't know about you, but I've got 56% OC headroom from 900MHz, while I doubt the 1300MHz-clocked 970 here is going to reach the same OC at 2.1GHz...
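For what it's worth, the arithmetic in that post checks out as a back-of-envelope calculation (a sketch using the clocks quoted above, nothing measured):

```python
# Rough clock math from the quoted figures (assumed, not measured).
gk110_base = 900     # MHz, boost-locked GK110 clock in the test
gm204_clock = 1300   # MHz, the 970 in the test

advantage = gm204_clock / gk110_base - 1   # ~0.44, i.e. roughly a 44-50% clock advantage
gk110_oc = gk110_base * 1.56               # 56% OC headroom -> ~1404 MHz
gm204_same_oc = gm204_clock * 1.56         # same relative OC would need ~2028 MHz (~2.1 GHz)
```

So a 56% overclock from 900MHz is entirely plausible, while the equivalent ~2.1GHz on GM204 is not, which is the asymmetry being pointed out.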

Interesting, strange if true.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
Ouch, Kepler.

I like that they threw in some old cards. Why is a 1.25GB Fermi beating a 660 2GB? Fermi compute ftw? I'd love to see a GTX 580 3GB in this game. I wonder if it can touch a 670?
 

omek

Member
Nov 18, 2007
137
0
0
No, every game does not require a new driver release. However, Nvidia has released an optimized driver. TW3 is one of the most anticipated releases this year, and there is nothing from AMD other than talking well-deserved crap about GameWorks. Less crap talk, more optimized drivers, especially for multi-GPU users such as myself.

Well... wait. Where and how is AMD talking crap? Is it PR, a crony, or a cop-out to prevent someone from getting mad? I ask because I've seen no response or comment from any AMD representative...
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
Well... wait. Where and how is AMD talking crap? Is it PR, a crony, or a cop-out to prevent someone from getting mad? I ask because I've seen no response or comment from any AMD representative...
AMD's Roy is spreading it all around in his tweets, saying the developers are bad, stuff like that.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
I think AMD has been low key and classy about GameWorks myself. A welcome change for a company that most often either doesn't say anything or sticks its foot in its mouth.
 
Feb 19, 2009
10,457
10
76
AMD's Roy is spreading it all around in his tweets, saying the developers are bad, stuff like that.

Where did he blame bad developers? I think you are confusing his comments with those of the people who tweet at his account.

AMD Roy is a pretty nice guy, too nice in fact for corporate warfare against the likes of NV.

@chimaxi83
You should buy NV if you put the blame on AMD for failing to provide CF support or "game ready" drivers for GameWorks titles, because it's not gonna happen. Every GW title to date has released with broken CF.
 
Feb 19, 2009
10,457
10
76
He was quite non-vocal regarding GW until recently. I think it finally got to him, he's pissed... retweeting tons of anti-GW tweets! lol
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
Well... wait. Where and how is AMD talking crap? Is it PR or a croney or is it a cop-out to prevent someone from getting mad? I ask because I've seen no response or comment from any AMD representative...

I think AMD has been low key and classy about GameWorks myself. A welcome change for the company they most often either don't say anything or stick their foot in their mouth.
Yeah, I should have clarified. He has retweeted a load of tweets, all talking about how GW is hurting gaming. So yes, he is bringing attention to a very real issue, though he's not directly talking crap himself.
Where did he blame bad developers? I think you are confusing his comments to those who tweet to his account.

AMD Roy is pretty nice guy, too nice in fact for corporate warfare against the likes of NV.

@chimaxi83
You should buy NV if you put the blame on AMD for failing to provide CF support or "game ready" drivers for GameWorks titles. Because its not gonna happen. Every GW title has released with broken CF to date.
He retweets a lot of those comments. I would think that means he agrees, as do many of us. You're right about my expectations of drivers for a GW title, though. I was just mad; flickering sometimes while using CF is annoying as hell lol.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
Posted full details in the Gaming section, but my 780 Ti can hit a locked 60FPS on the High preset with textures on Ultra. The sky isn't falling. The game looks fine.
 

96Firebird

Diamond Member
Nov 8, 2010
5,742
340
126
They released that a long time ago; I even tested the Forward+ demo.

Which is what's used in The Witcher 3, in fact: Forward+ rendering, which enables deferred engines to apply global lighting & MSAA via DirectCompute.

It's open source, like all of AMD's features.

"CD Projekt's in-house engine for The Witcher 3 uses DirectX 11 and a Forward+ renderer; texture blending and an LoD system from the Umbra 3 middleware save power and memory."

http://www.pcgameshardware.de/The-Witcher-3-PC-237266/Specials/Technik-Test-1158845/

From another thread, I think Silverforce might have solved the issue of Kepler performance in this title...

[Image: AAoQAnL.png]


Ryan Smith said:
DirectCompute is the compute backend for C++ AMP on Windows, so this forms our other DirectCompute test.

As you can see here, in pure DirectCompute testing, Tahiti does better than the 780 but sits under the 780 Ti. Since TW3 uses Forward+ rendering, which relies on DirectCompute, you see Kepler tanking. I wish I could post compute results that included the 960, but Anandtech didn't review that card. If anyone can find more DirectCompute results, this may hold some weight...

Edit - Back to the original Titan review...

[Image: AETiSwe.png]


Ryan Smith said:
Surprisingly, for all of its performance gains relative to GTX 680, Titan still falls notably behind the 7970GE here. Given Titan’s theoretical performance and the fundamental nature of this test we would have expected it to do better. But without additional cross-platform tests it’s hard to say whether this is something where AMD’s GCN architecture continues to shine over Kepler, or if perhaps it’s a weakness in NVIDIA’s current DirectCompute implementation for GK110. Time will tell on this one, but in the meantime this is the first solid sign that Tahiti may be more of a match for GK110 than it’s typically given credit for.
 
Last edited:

alcoholbob

Diamond Member
May 24, 2005
6,387
465
126
My issue with Witcher 3 is that every time I start the game I have to go into the graphics menu to switch from borderless fullscreen to fullscreen mode. I kept scratching my head about why the framerate was so low until I realized the top card was running at 85C and the bottom card at 35C.
 
Feb 19, 2009
10,457
10
76
I doubt the reason for Kepler tanking is Forward+, because it's used in other games & engines as well.

Its open-source nature means developers are able to optimize it fully, or even choose which parts of their game use it.

Generally, open-world games need deferred engines or performance is destroyed, but by going deferred they lose the ability to perform lots of post-processing on materials, including global lighting, and MSAA is non-functional or incurs a massive performance penalty. That's where Forward+ comes in: it retains all the pros but solves the cons.
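For anyone unfamiliar with the technique: Forward+ adds a compute pass (DirectCompute in D3D11) that bins lights into screen tiles, so the forward shading pass only loops over each tile's short light list instead of every light in the scene. A minimal CPU-side sketch of the tile-culling idea, with all names and numbers mine and purely illustrative:

```python
# Minimal sketch of Forward+ tiled light culling (illustrative only).
# The screen is split into fixed-size tiles; each tile records the
# indices of lights whose screen-space radius overlaps it.
TILE = 16  # tile size in pixels

def cull_lights(width, height, lights):
    """lights: list of (x, y, radius) tuples in screen space."""
    tiles_x = (width + TILE - 1) // TILE
    tiles_y = (height + TILE - 1) // TILE
    tile_lights = [[[] for _ in range(tiles_x)] for _ in range(tiles_y)]
    for idx, (lx, ly, r) in enumerate(lights):
        # Range of tiles touched by the light's screen-space bounding box.
        x0 = max(0, int((lx - r) // TILE))
        x1 = min(tiles_x - 1, int((lx + r) // TILE))
        y0 = max(0, int((ly - r) // TILE))
        y1 = min(tiles_y - 1, int((ly + r) // TILE))
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                tile_lights[ty][tx].append(idx)
    return tile_lights

# Two lights on a 64x32 screen: each only lands in the tiles it overlaps.
tiles = cull_lights(64, 32, [(10, 10, 5), (60, 20, 8)])
```

On the GPU this culling runs as a compute shader, which is exactly why a card's DirectCompute throughput matters for Forward+ titles.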

It's used in Crytek's engine:
http://www.dualshockers.com/2014/03...how-it-became-a-visual-showcase-for-xbox-one/

I don't see Kepler tanking in Crytek games; Ryse & Evolve run great on 780/Ti/Titan.

Also Forza Horizon 2 uses Forward+
http://www.dualshockers.com/2014/08...-display-thousand-of-dynamic-lights-and-more/

Oh, and all the DiRT games after Showdown used Forward+ too.
 
Last edited:

96Firebird

Diamond Member
Nov 8, 2010
5,742
340
126
In Ryse, 280X > Titan...

[Image: hlldaOA.jpg]


Isn't that exactly what people are complaining about in TW3? Am I looking at out-of-date benchmarks? I believe Evolve is the same way, but I'll look that up as well...

It can be as open as it wants; if Kepler doesn't do well at DirectCompute, it won't do well in Forward+ rendering.

Edit - Evolve is better for the Titan, but it is still barely above the 280X, which is about equal to the 780.

[Image: FUdPbjM.jpg]


Edit2 - The 960 is still lower than most; maybe it's running into a VRAM bottleneck that doesn't show up in TW3?
 
Last edited:

thilanliyan

Lifer
Jun 21, 2005
12,060
2,273
126
Is anyone else's fps capped at 30fps during gameplay? I have vsync off in game and in CCC, and the frame rate limiter is set to unlimited. Not sure what's wrong.

edit: nm, it turns out HairWorks was killing the fps by nearly half.
 
Last edited:

Udgnim

Diamond Member
Apr 16, 2008
3,681
124
106
Last edited:

Head1985

Golden Member
Jul 8, 2014
1,867
699
136
Last edited:

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
Don't go by these benchmarks, or any others. My gameplay is the opposite of what you see in benchmarks; I'm getting much better fps than the charts show on my 780 Ti. Roughly 50fps with just shadows on low and everything else on High.

Like mentioned before, the game defaults to windowed fullscreen; when I put it to fullscreen the FPS jumps from 30 to 50ish.