[PcGameshardware] The Witcher 3 Benchmark


tg2708

Senior member
May 23, 2013
Having to agree with all the points about Kepler's current yet questionable performance, as well as possible Maxwell neglect (time will tell) when Pascal releases, I'm now a bit wary of Nvidia products. I still think they make good products, but I would hate to go SLI or single top-end only to find out not even two cards will give me optimal performance for a year or two.
 

RussianSensation

Elite Member
Sep 5, 2003
Not related to the discussed issue, but it's curious that in the first pic the 290 is at 17759 and suddenly jumps to 27767 in the second pic, a score higher than the 290X in the third pic...

The same isn't true for the nVidia cards from the first pic, the oldest, to the most recent one; quite indicative of TechReport's usual shilling...

Off-topic but wanted to address your point:

Good catch. I didn't even notice that. I provided AT's measurements too as a second point of reference. TR did everything possible in their reviews to recommend the 960 over the 290/280X and consistently downplayed the existence of after-market HD 7950/7970/R9 290/290X cards ever since Tahiti XT/Hawaii XT launched. Also, they wouldn't stop their FCAT testing of the HD 7000 series to prove how much CF sucked, but I haven't seen a single FCAT test on their site since Maxwell launched.

[attached benchmark screenshots]

IMO, Tech Report's reputation was shot already when they showed double standards in FCAT testing with how they approached NV vs. AMD setups. That's why I use a lot of European sites now; they are less likely to play favourites and don't get invited to NV corporate events/outings or get showered with gifts.

A while back, Silverforce11 posted a video showing how the North American reviewers get treated by NV at corporate events.
 

AnandThenMan

Diamond Member
Nov 11, 2004
No wonder we don't see FCAT testing from the usual suspects anymore; the reversal of fortunes, so to speak, is unbelievable.
 

alcoholbob

Diamond Member
May 24, 2005
I'll chime in as the second poster to mention that the in-game vsync seems to break SLI frame pacing. Massive microstutter/stuttering with it, as fps basically swings from 45 to 60 a hundred times a minute. It feels like you are playing at 10 fps. Of course, I'm sure you only see this issue if you are barely holding 60 fps with dual cards.

If you are running SLI, definitely use the control panel vsync; it's buttery smooth. With it, when I dip to around 57 fps or so, I'm not swinging violently down to 45 fps and back to 60 like with the in-game vsync.
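A quick sketch of why that 45-to-60 fps swing feels so rough (my own illustrative numbers, not measurements from this thread): frame time in milliseconds is 1000/fps, so each swing is a sudden multi-millisecond jump in frame delivery, exactly the kind of spike FCAT-style frame-time testing catches even when the average fps looks fine.

```python
def frame_time_ms(fps):
    """Convert frames per second to per-frame render time in milliseconds."""
    return 1000.0 / fps

# Each vsync swing between 60 fps and 45 fps changes the frame time abruptly.
swing = frame_time_ms(45) - frame_time_ms(60)

print(f"60 fps -> {frame_time_ms(60):.1f} ms per frame")
print(f"45 fps -> {frame_time_ms(45):.1f} ms per frame")
print(f"each swing adds ~{swing:.1f} ms of judder per frame")
```

An average-fps counter smooths these jumps away, which is why the game can report a healthy number while still feeling like 10 fps.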
 

RussianSensation

Elite Member
Sep 5, 2003
Lol, so GTX 780 SLI is on par with a single 290X.

A single R9 280X/HD 7970 GHz is faster than a GTX 690 (nearly 680 SLI), while the 670 is losing to an R9 270. Another perspective: a $180 GTX 960 is just 3 fps behind the $1000 GTX 690 and $650 GTX 780. D:

http://www.gamegpu.ru/images/stories/Test_GPU/RPG/The_Witcher_3_Wild_Hunt/game/new/1920_u_off.jpg


Hopefully more sites do testing with GameWorks off and retest TW3 with the latest drivers from NV/AMD around Fiji's launch to see if anything changes as far as Kepler performance goes. It will be very interesting to compare CF vs. SLI frame times in this game. HardOCP reported that CF exhibited more stuttering and that they needed higher FPS to enjoy the game with R9 290Xs than with 970/980 SLI setups. I think it's a mistake that sites stopped FCAT testing, because FPS alone doesn't tell us the full story.

For example, in TW3 the Titan X is about as fast as 970 SLI, but what if the Titan X actually feels smoother?
 
Feb 19, 2009
For example, in TW3, the Titan X is about as fast as 970 SLI but what if the Titan X actually feels smoother?

As said, we do need sites to keep using FCAT, because it shows one of the advantages of having lots of vram: the fps may be similar, but when the 970 stutters in SoM, we know less vram will affect frame times.

For some, this may be a reason to go for Titan X SLI setups over 980 Ti or 390X multi-GPUs: pay the premium for the extra vram and potentially smoother gameplay.
 

RussianSensation

Elite Member
Sep 5, 2003
It also seems some people aren't happy that the two SLI 780 Tis they spent big $$$ on aren't performing well with GameWorks enabled. Honestly, they should be happy that these cards still perform well without HairWorks enabled. 10+ years ago, graphics cards were truly being made irrelevant (not being able to play the latest games at all) in a matter of a year or two. An example would be the Radeon X850 XT; this card was a pixel shader 2.0 card. Shortly after this card came out, the shader model 3.0 revolution happened (around the time the Xbox 360 became popular). This rendered the card almost completely obsolete in a matter of a year or two. The Radeon X850 XT was considered a good card at the time, and it wasn't chastised, because people back then knew that shader model 3.0 was the future and that R400 was going to be obsolete. Things were even worse during the late 90s.

That's different. During those eras, GPU performance increased at a rate of 75-100% every 18-24 months, like clockwork. No one who bought $500+ cards held on to a GPU for more than 2 years in those days. Everything you bought was obsolete in a matter of 12 months. That's why no one cared that the X850 XT PE didn't have SM3.0 support: we knew we would be moving on to the X1800/1900 series or the NV equivalent.

Also, if you pay close attention, Kepler bombs in GameWorks games but performs well in almost all AMD Gaming Evolved games and very well in vendor-agnostic games. If Kepler showed bad performance in almost all modern games, then you'd have a point.

In today's gaming environment, it's not normal for $700-1000 flagship cards like the 780 Ti/Titan to be barely faster than a $299 R9 280X and lose badly to a $330 970. There were two times when NV's architecture destroyed their cards' performance: DX9 games with all GeForce 5s, and next-gen shader-intensive games with the GeForce 7. However, it's hard to pinpoint Kepler's particular weakness since, as others noted, this primarily shows up in GameWorks titles.
 

Head1985

Golden Member
Jul 8, 2014
Oh, now the real picture is emerging. First GameGPU retested the game without HairWorks, and it shows 290X > 970, 290 > 780 Ti, and 280X > 780.

Now Computerbase stepped up to the plate.

Legend for clock speeds:

R9 290X "OC" = 1030 MHz
R9 290 "OC" = 1007 MHz
***In the past, Computerbase used a reference 290 clocked at 862-901 MHz and a 290X clocked at 838-871 MHz. o_O :eek: :sneaky:

980 "OC" = 1177 MHz base clock (the reference card only has a base clock of 1126 MHz)
970 "OC" = 1184 MHz (the reference card only has a base clock of 1050 MHz)


Without HairWorks:

Titan X = 59.5 fps
980 = 49.1 fps
290X = 44.1 fps
290 = 41.2 fps
970 = 40 fps
780Ti = 35.4 fps :rolleyes:
780 = 29.6 fps
285 = 28.2 fps
280X = 28.1 fps
http://www.computerbase.de/2015-05/...diagramm-grafikkarten-benchmarks-in-1920-1080

Essentially, the game is ONLY playable at good fps at 1080p without HairWorks unless one is packing 970 SLI, 980 SLI, or Titan X/Titan X SLI.

With Hairworks aka AMD Cripple-works:

Titan X = 49 fps
980 = 40.2 fps
970 = 33.7 fps
780Ti = 28.5 fps
290X = 28.2 fps
290 = 26.8 fps
780 = 24.1 fps
770 = 20.9 fps
280X = 20.3 fps
285 = 19.4 fps
http://www.computerbase.de/2015-05/...diagramm-grafikkarten-benchmarks-in-1920-1080

^^^ Who wants to play at those frames per second at 1080p? :D
LOL at those clocks.
My GTX 970 runs at 1367 MHz stock, and after an OC to 1500/8000 there is another ~20% performance increase over that 1367 MHz.
Why do they test the GTX 970 at such low clocks? 1266 MHz for the OC card and a 1184 MHz boost for the reference? Really?
Edit: same with the 290/290X, underclocked like hell.
This also confirms why in some reviews the GTX 970 is like 25-30% slower than the GTX 980: they probably keep the 970 at its base ~1180 MHz while their 980 boosts to 1260 MHz+.
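A back-of-envelope check of the clock argument above, assuming performance scales roughly linearly with core clock (optimistic, since memory bandwidth matters too): how much headroom the review setup leaves on the table by testing a 970 at a 1184 MHz boost instead of the clocks owners actually see.

```python
def clock_gap_pct(tested_mhz, typical_mhz):
    """Percent performance left on the table by testing at tested_mhz
    instead of typical_mhz, assuming roughly linear clock scaling."""
    return (typical_mhz - tested_mhz) / tested_mhz * 100

# Clocks quoted in the posts above: 1184 MHz tested boost vs.
# 1367 MHz stock retail boost and a 1500 MHz overclock.
print(f"970 @ 1184 vs 1367 MHz stock retail: ~{clock_gap_pct(1184, 1367):.0f}% left on the table")
print(f"970 @ 1184 vs 1500 MHz overclocked:  ~{clock_gap_pct(1184, 1500):.0f}% left on the table")
```

That ~15% gap versus a typical retail card is consistent with the complaint that review numbers understate what an off-the-shelf 970 actually delivers.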

I think the best test is still from PCGamesHardware, where both the 970 and 980 use the same clocks, plus they use non-reference 290/290X and 780 Ti cards.
http://www.pcgameshardware.de/The-Witcher-3-PC-237266/Specials/Grafikkarten-Benchmarks-1159196/
 

Fox5

Diamond Member
Jan 31, 2005
That's different. During those eras, GPU performance increased at a rate of 75-100% every 18-24 months, like clockwork. No one who bought $500+ cards held on to a GPU for more than 2 years in those days. Everything you bought was obsolete in a matter of 12 months. That's why no one cared that the X850 XT PE didn't have SM3.0 support: we knew we would be moving on to the X1800/1900 series or the NV equivalent.

Also, if you pay close attention, Kepler bombs in GameWorks games but performs well in almost all AMD Gaming Evolved games and very well in vendor-agnostic games. If Kepler showed bad performance in almost all modern games, then you'd have a point.

In today's gaming environment, it's not normal for $700-1000 flagship cards like the 780 Ti/Titan to be barely faster than a $299 R9 280X and lose badly to a $330 970. There were two times when NV's architecture destroyed their cards' performance: DX9 games with all GeForce 5s, and next-gen shader-intensive games with the GeForce 7. However, it's hard to pinpoint Kepler's particular weakness since, as others noted, this primarily shows up in GameWorks titles.

Its weakness is compute. Performance bombed in Tomb Raider with TressFX, but now Maxwell does ok. It even suffers a bit under Civ 5's really simple compute benchmark.
 

Face2Face

Diamond Member
Jun 6, 2001
I recorded a video with ShadowPlay: my overclocked GTX 780 running the game at Ultra settings at 1080p with no HairWorks enabled.

https://www.youtube.com/watch?v=ahL3B6MDtXQ

Frame rate is in the 40-50+ range.

Quite a bit higher vs. what's shown here.

Without HairWorks:

Titan X = 59.5 fps
980 = 49.1 fps
290X = 44.1 fps
290 = 41.2 fps
970 = 40 fps
780Ti = 35.4 fps
780 = 29.6 fps
285 = 28.2 fps
280X = 28.1 fps
http://www.computerbase.de/2015-05/w...s-in-1920-1080

I later realized that I'm running the older drivers as well (350.12). Time to update those. I assume it should only help performance.
 
Feb 19, 2009
Its weakness is compute. Performance bombed in Tomb Raider with TressFX, but now Maxwell does ok. It even suffers a bit under Civ 5's really simple compute benchmark.

Performance of Kepler suffered in TressFX for about two weeks until NV updated their drivers to optimize Kepler for it.



http://www.hardocp.com/article/2013...deo_card_performance_iq_review/6#.VVsAI_mqpBc

Note the DATE of the article. March 20, 2013.

Release date of Tomb Raider with TressFX?

http://store.steampowered.com/app/203160/

March 4, 2013. [H] managed to get the drivers, run their tests, and publish the article write-up within a few weeks.

Kepler tanking these days in NV-sponsored games is simply a lack of optimizations.
 

Hi-Fi Man

Senior member
Oct 19, 2013
The fact that 780 Ti SLI is performing so poorly just proves that Kepler does not have the same priority for driver optimizations. I'm sure they will optimize for Kepler eventually, but it's ridiculous that they did literally no optimizations at launch and continue to neglect it in these GameWorks titles.

I hope all this commotion will wake up Nvidia. Customer support after the sale is extremely important to ensure future sales.

It would be nice if we could see some real investigative journalism from the tech press...
 

Headfoot

Diamond Member
Feb 28, 2008
I'm sure they will optimize for Kepler eventually

I wouldn't be so sure... unless the press makes another GTX 970 3.5+0.5 GB-level stink about it, it'll be below mainstream notice and therefore not important enough to merit investment.
 

Stuka87

Diamond Member
Dec 10, 2010
Its weakness is compute. Performance bombed in Tomb Raider with TressFX, but now Maxwell does ok. It even suffers a bit under Civ 5's really simple compute benchmark.

Compute was a weakness of the small Keplers; however, the 780/780 Ti do not fall under this. They have respectable compute, yet they still perform poorly.

Also, TressFX only sucked before nVidia updated the drivers; then they were fine, and today AMD and nVidia cards perform roughly the same in Tomb Raider.
 

Stuka87

Diamond Member
Dec 10, 2010
Off-topic but wanted to address your point:

Good catch. I didn't even notice that. I provided AT's measurements too as a second point of reference. TR did everything possible in their reviews to recommend the 960 over the 290/280X and consistently downplayed the existence of after-market HD 7950/7970/R9 290/290X cards ever since Tahiti XT/Hawaii XT launched. Also, they wouldn't stop their FCAT testing of the HD 7000 series to prove how much CF sucked, but I haven't seen a single FCAT test on their site since Maxwell launched.

IMO, Tech Report's reputation was shot already when they showed double standards in FCAT testing with how they approached NV vs. AMD setups. That's why I use a lot of European sites now; they are less likely to play favourites and don't get invited to NV corporate events/outings or get showered with gifts.

A while back, Silverforce11 posted a video showing how the North American reviewers get treated by NV at corporate events.

I had been kind of wondering why nobody showed FCAT scores anymore, but you pretty much nailed it. Which is a very sad thing :(
 

Deders

Platinum Member
Oct 14, 2012
The fact that 780 Ti SLI is performing so poorly just proves that Kepler does not have the same priority for driver optimizations. I'm sure they will optimize for Kepler eventually, but it's ridiculous that they did literally no optimizations at launch and continue to neglect it in these GameWorks titles.

I hope all this commotion will wake up Nvidia. Customer support after the sale is extremely important to ensure future sales.

It would be nice if we could see some real investigative journalism from the tech press...

I believe Nvidia are looking into the issue for Kepler:
https://forums.geforce.com/default/...he-witcher-3-wild-hunt-/post/4533205/#4533205
 

SimianR

Senior member
Mar 10, 2011
So when I thought it was weird that the 780 Ti, a $699 card only last September, is performing between a 960 (a $200 card) and a 970 (a $330 card), I might have been right? Gee, the guys who kept hitting me over the head with "all high-end parts are a bad investment, this is completely normal" and "the 780 Ti is obsolete, move on" were so sure. I'm sorry, but high-end parts usually don't sink this badly this quickly. Either NVIDIA stopped optimizing for Kepler or the architecture really wasn't forward-thinking; maybe it's a combination of both.
 

boozzer

Golden Member
Jan 12, 2012
The NV guy claims they are looking into it, as if it's something they were not aware of? Surely they test a big game release like this with more than just 900-series cards.
the kepler owners are pissed, so they are doing dmg control :)
 

Deders

Platinum Member
Oct 14, 2012
the kepler owners are pissed, so they are doing dmg control :)

I too am pissed. I spent about £400 on an Asus 780 DCUII thinking, like most people at the time, that the next series wouldn't arrive until after Christmas at least. Two months later the 970 and 980 appeared.
 

Azix

Golden Member
Apr 18, 2014
Compute was a weakness of the small Keplers; however, the 780/780 Ti do not fall under this. They have respectable compute, yet they still perform poorly.

Also, TressFX only sucked before nVidia updated the drivers; then they were fine, and today AMD and nVidia cards perform roughly the same in Tomb Raider.

HairWorks really is not that hard on compute, IMO. It's the over-tessellation and antialiasing.

AMD actually addressed this game; I wasn't aware. HairWorks seems like utter rubbish, and AMD was thinking Nvidia would do well, so they never bothered offering TressFX.

http://arstechnica.co.uk/gaming/201...s-completely-sabotaged-witcher-3-performance/

I asked AMD's chief gaming scientist Richard Huddy, a vocal critic of Nvidia's GameWorks technology, about AMD's involvement with CD Projekt Red, and the support it had reportedly failed to provide to the developer: "That's an unfortunate view, but that doesn't follow from what we're seeing," said Huddy. "We've been working with CD Projeckt Red from the beginning. We've been giving them detailed feedback all the way through. Around two months before release, or thereabouts, the GameWorks code arrived with HairWorks, and it completely sabotaged our performance as far as we're concerned. We were running well before that...it's wrecked our performance, almost as if it was put in to achieve that goal."

Ultimately, though, there's an additional amount of time and cost attached to including two very different types of technology to produce largely the same effect. According to AMD's Huddy, the company "specifically asked" CD Projekt Red if it wanted to put TressFX in the game following the revelation that HairWorks was causing such a large drop in performance, but apparently the developer said 'it was too late.'

In addition to saying that HairWorks was causing "contrived damage to AMD," Huddy was particularly damning of the fundamental technology behind HairWorks, saying that it's "spectacularly inefficient on both bits of hardware." Huddy may very well be right on that point. The first wave of Witcher 3 benchmarks have started to trickle in, and Nvidia's HairWorks technology is proving to be quite the performance hog. Over at German site Hardwareluxx, they found that turning on HairWorks dropped the frame rate on a GTX 980 down from 87.4 FPS to 62.2 FPS, a performance hit of around 30%. The situation was far worse for the R9 290X, which took a gargantuan hit from 75.8 FPS to 29.4 FPS.
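The Hardwareluxx figures quoted above can be turned into percentage hits directly; a small sketch (using only the fps numbers from the article quote):

```python
def perf_hit_pct(fps_off, fps_on):
    """Percent of frame rate lost when a feature is enabled."""
    return (fps_off - fps_on) / fps_off * 100

# Hardwareluxx numbers quoted above: fps with HairWorks off vs. on.
for card, off, on in [("GTX 980", 87.4, 62.2), ("R9 290X", 75.8, 29.4)]:
    print(f"{card}: {off} -> {on} fps, a ~{perf_hit_pct(off, on):.0f}% hit")
```

That works out to roughly a 29% hit on the 980 and a 61% hit on the 290X, which is the disparity Huddy is complaining about: the effect is expensive everywhere, but about twice as costly in relative terms on the AMD card.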
 

iiiankiii

Senior member
Apr 4, 2008
Holy cow! Nvidia really doesn't give a crap about Kepler. I can't believe Nvidia is willing to throw Kepler under the bus to make AMD look bad. GameWorks FTW