[GameGPU] The Witcher 2 Enhanced Edition - Retro GPU/CPU test

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Uber-Sampling is enabled in all tests.

Very impressive SLI and CF scaling from both camps! Hopefully TW3 is like this. Also, we can use this as a good reference point to see how demanding TW3 is vs. TW2 this month.

http://www.gamegpu.ru/images/stories/Test_GPU/Retro/The_Witcher_2_Assassins_of_Kings/cach/w2_1920.jpg


http://www.gamegpu.ru/images/stories/Test_GPU/Retro/The_Witcher_2_Assassins_of_Kings/cach/w2_2560.jpg


http://www.gamegpu.ru/images/stories/Test_GPU/Retro/The_Witcher_2_Assassins_of_Kings/cach/w2_3840.jpg


http://www.gamegpu.ru/images/stories/Test_GPU/Retro/The_Witcher_2_Assassins_of_Kings/cach/w2_proz.jpg
 
Feb 19, 2009
10,457
10
76
I'm hoping CD Projekt Red optimizes for both AMD & NV like Rockstar did for GTA V. Chances are good given TW1/2 run very well on both camps' hardware (except for AMD CPUs, nothing can help those heh).
 

thetuna

Member
Nov 14, 2010
128
1
81
If I remember correctly, Übersampling came with a massive performance hit and questionable IQ gains. (Something about rendering the scene multiple times and compositing?)
 

x3sphere

Senior member
Jul 22, 2009
722
24
81
www.exophase.com
If I remember correctly, Übersampling came with a massive performance hit and questionable IQ gains. (Something about rendering the scene multiple times and compositing?)

It's like using DSR; it multiplies the resolution. So if you use Ubersampling at 1080p, you're effectively rendering at 4K.

It gives you pristine IQ, but yes, the performance hit is enormous.
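A minimal back-of-the-envelope sketch of that math, assuming Ubersampling behaves like plain supersampling at 2x per axis (the factor is inferred from the 1080p-to-4K comparison above, not from the game's documentation):

```python
def supersampled_pixels(width, height, factor_per_axis=2):
    """Pixels actually shaded when supersampling by `factor_per_axis`."""
    return (width * factor_per_axis) * (height * factor_per_axis)

native = 1920 * 1080                         # 2,073,600 pixels on screen
rendered = supersampled_pixels(1920, 1080)   # 3840x2160 internally
print(rendered)           # 8294400 -- the same pixel count as 4K
print(rendered / native)  # 4.0 -- roughly 4x the shading work
```

Under that simple cost model, 4x the pixels means roughly 4x the shading load, which matches the enormous hit people report.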
 

psolord

Golden Member
Sep 16, 2009
1,931
1,194
136
Very interesting stuff, thanks.

If I may chime in with my own benchmarks: I have not quite finished my Witcher 2 EE runs on all my systems, but at least I have my GTX 970 ones ready.

Ultra settings, minus Ubersampling for me, because some of my systems will not handle it at all; these settings should also better expose CPU limitations.

So.....(spicy wallpaper alert)

Witcher 2 1920x1080 Custom Ultra, GTX 970 @1.5GHz, Core i5-2500K @4.8GHz - 118fps

Witcher 2 1920x1080 Custom Ultra, GTX 970 @1.5GHz, Core i7-860 @4GHz - 112fps

Witcher 2 1920x1080 Custom Ultra, GTX 970 @1.5GHz, Q9550 @4GHz - 97fps


There are a few battles in this benchmark and it's outdoors, so I was surprised that all systems managed such good performance. Especially the Q9550, which, as expected, showed quite a few CPU limits in parts (visible at the end of the video in the MSI Afterburner graph), but was still very, very playable.

I wonder how Witcher 3 will be on both the CPU and GPU side, but mostly the CPU side, because on the GPU I expect at least 2x more load.

PS It just occurred to me that I have never seen Witcher 2 running on my 7950s, lol. I am right on it!
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
I personally think that DSR and VSR make Ubersampling obsolete. They give a lot more control and allow for a smaller performance hit by not forcing 4x.
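As a rough illustration of that control, here's a sketch assuming render cost scales with pixels shaded; the factor values mirror typical DSR slider steps, and the function name is mine:

```python
def dsr_resolution(width, height, pixel_factor):
    """Internal render size for a DSR/VSR factor given in total pixels."""
    scale = pixel_factor ** 0.5  # per-axis scale is the square root
    return round(width * scale), round(height * scale)

# 4.0x matches Ubersampling's cost; the fractional steps sit in between.
for factor in (1.5, 1.78, 2.0, 4.0):
    w, h = dsr_resolution(1920, 1080, factor)
    print(f"{factor}x -> {w}x{h} ({w * h / (1920 * 1080):.2f}x the pixels)")
```

So a 1.78x factor shades well under half the pixels of a fixed 4x ubersample, which is why the hit is so much smaller.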
 

amenx

Diamond Member
Dec 17, 2004
3,950
2,189
136
See how Kepler is doing much better in older games? It's becoming more apparent that NV is not as focused on optimizing it in newer titles.
 

Spjut

Senior member
Apr 9, 2011
928
149
106
See how Kepler is doing much better in older games? It's becoming more apparent that NV is not as focused on optimizing it in newer titles.

We should be careful when comparing old games. TW2 was just a DX9 game and Kepler wasn't even released back then.

Maxwell has certain advantages over Kepler. For some new games at least, it might just be that Nvidia encourages using those features, despite it becoming a disadvantage for Kepler.

I personally wouldn't be surprised if Nvidia has stopped releasing performance drivers for Kepler, though I hope that isn't the case. But users should consider that there can be other reasons for Kepler's disadvantages in newer games.
 

Makaveli

Diamond Member
Feb 8, 2002
4,724
1,061
136
I'm still amazed at how much the 280X/7970GHz has been able to close the gap with the 780.
 

Insomniator

Diamond Member
Oct 23, 2002
6,294
171
106
I'm still amazed at how much the 280X/7970GHz has been able to close the gap with the 780.

Heh, amazed... that's the word... yea... :(... :'(

<--- sold my 7950 to buy a 780 a year ago.

To be fair, the 7950 sold for $300 and the 780 was only $365, but still, the 780 looks more and more lame with every benchmark I see, which is almost the opposite of what I expected given how beefy the specs are/were.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Heh, amazed... that's the word... yea... :(... :'(

<--- sold my 7950 to buy a 780 a year ago.

To be fair, the 7950 sold for $300 and the 780 was only $365, but still, the 780 looks more and more lame with every benchmark I see, which is almost the opposite of what I expected given how beefy the specs are/were.

In some ways the 780 is similar to a 7950. The card is meh at stock speeds but has 30-40% overclocking headroom. I believe the reference 780's boost is 900MHz, but I've seen 780s OC to 1200-1250MHz+. Once overclocked, a 780 should approach a reference 290X/780 Ti in performance. The 7950 and 780 were two of those cards (along with the 460/470) that really show much improved performance with overclocking. Considering the 780 is still 43% faster than a 7950 in those charts, for $65 you got a smoking deal. Also, in other games the 780 leads by much more.
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
Uber-Sampling is enabled in all tests.

Very impressive SLI and CF scaling from both camps!

This is a DX9 game, so I would assume frame pacing with the AMD cards is broken and the results should be ignored?

There are a few battles in this benchmark and it's outdoors, so I was surprised that all systems managed such good performance. Especially the Q9550, which as was expected, presented quite a few cpu limits at parts, as shown at the end of the video in MSI Afterburner graph, but still very very playable.

playing the game, one of the most demanding parts (for the CPU) I can remember was this one
https://www.youtube.com/watch?v=w_ydZOu2Tk8&feature=youtu.be&t=8m40s
 

alcoholbob

Diamond Member
May 24, 2005
6,271
323
126
If I remember correctly, Übersampling came with a massive performance hit and questionable IQ gains. (Something about rendering the scene multiple times and compositing?)

The ubersampling numbers are basically exactly the same as 4K No AA benchmarks.
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
I have no recollection of that part.

Is that specific to the EE version?

Or maybe it was available on a different path I took?!

That's part of the original game. I played it a few years back when it was new and did both choices; I think this one is when you side with the "rebels" or whatever they were called.

I think another place to test, which was also pretty hard with the slow CPUs I was using, is when you arrive in Flotsam and there is the whole execution thing going on... but again, I played it a long time ago on very slow CPUs, and the battle in the video I posted was the worst part I can remember. The last time I tested it was with my old i3 2100, and it ran at a constant 24FPS during that battle (with very low GPU usage).
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
So Ubersampling murders CPUs too? If a 5960X can't get 60FPS minimums er . . . . .
 

psolord

Golden Member
Sep 16, 2009
1,931
1,194
136
That's part of the original game. I played it a few years back when it was new and did both choices; I think this one is when you side with the "rebels" or whatever they were called.

I think another place to test, which was also pretty hard with the slow CPUs I was using, is when you arrive in Flotsam and there is the whole execution thing going on... but again, I played it a long time ago on very slow CPUs, and the battle in the video I posted was the worst part I can remember. The last time I tested it was with my old i3 2100, and it ran at a constant 24FPS during that battle (with very low GPU usage).

I see.

I played and finished it on my 860 @4GHz and 570 SLI at the time, and it was a great experience. Can't recall many slowdowns.

New things are coming ahead. Interesting times!

Thanks for the info. :)
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
This is a DX9 game, so I would assume frame pacing with the AMD cards is broken and the results should be ignored?

This would be true for older cards like the 7970/7990. But for the 290/290x/295x2 this is not the case to my knowledge.
 

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
I'm still amazed at how much the 280X/7970GHz has been able to close the gap with the 780.

An HD 7970 GHz Edition is 25-26% slower than a GTX 780 in that game. An R9 280X is 11-14% slower (at 1080p or 1440p). How's that amazing? I paid $270 for a new PNY GTX 780 CC, which at the time was only a little more than an R9 280X (like $20 more). I'm fine with that purchase.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
An HD 7970 GHz Edition is 25-26% slower than a GTX 780 in that game. An R9 280X is 11-14% slower (at 1080p or 1440p). How's that amazing?

What you just typed isn't even logical.

1. There is no HD 7970 GHz on GameGPU's charts in this title. The 7970 there is the 925MHz version.

2. HD 7970 GHz > 280X, so your statement above cannot be true unless a factory-overclocked 280X is used against a 7970 GHz.

You can see in benchmarks all over the net that the 7970 GHz > R9 280X.

http://www.sweclockers.com/image/diagram/9433?k=5a2290db8f5ad9f560e38109f68d3112

I paid $270 for a new PNY GTX 780 CC, which at the time was only a little more than an R9 280X (like $20 more). I'm fine with that purchase.

Ya, that's a good deal. However, everyone who paid $500+ for a 780 wasted money. Today the card gets killed by R9 290/290X.

The 780 is now only 9% faster than the 7970 GHz at 1080p. Its performance delta at launch was far greater. If you play a lot of Blizzard titles, the upgrade was worth it for you, especially since it cost you so little. To really shine today, the 780 needs to be OC'd. Its stock performance is mediocre for how long the card remained at $500-650 levels.
 

Makaveli

Diamond Member
Feb 8, 2002
4,724
1,061
136
What you just typed isn't even logical.

1. There is no HD 7970 GHz on GameGPU's charts in this title. The 7970 there is the 925MHz version.

2. HD 7970 GHz > 280X, so your statement above cannot be true unless a factory-overclocked 280X is used against a 7970 GHz.

You can see in benchmarks all over the net that the 7970 GHz > R9 280X.

http://www.sweclockers.com/image/diagram/9433?k=5a2290db8f5ad9f560e38109f68d3112



Ya, that's a good deal. However, everyone who paid $500+ for a 780 wasted money. Today the card gets killed by R9 290/290X.

The 780 is now only 9% faster than the 7970 GHz at 1080p. Its performance delta at launch was far greater. If you play a lot of Blizzard titles, the upgrade was worth it for you, especially since it cost you so little. To really shine today, the 780 needs to be OC'd. Its stock performance is mediocre for how long the card remained at $500-650 levels.

Thanks, you did all the work for me :)

I'm thinking he was looking at the 7870 GHz numbers and maybe got confused.