
What do you think it will take to run the Witcher 3?


MagickMan

Diamond Member
Aug 11, 2008
7,460
3
76
Dual OCed 290s and you're just enjoying "most of the visuals"? Come on man, crank that shxx up LOL.

Carfax83 said:

So one 290 is good, plenty enough to enjoy the game with most of the visuals cranked up.

A lot of people don't have the cash for a xfire 290 setup, and a single 290 is fine for TW3 @1440p, with most settings on high.
 

I/O

Banned
Aug 5, 2014
140
0
0
MagickMan said:



A lot of people don't have the cash for a xfire 290 setup, and a single 290 is fine for TW3 @1440p, with most settings on high.

Dual OCed 290s .... "wipes drool from chin" LOL. That's some great performance from one 290 for sure.
 

I/O

Banned
Aug 5, 2014
140
0
0
Hey guys, thanks for making this thread a success. I enjoy good conversation about gaming, graphics cards, etc. ;-)
 

MagickMan

Diamond Member
Aug 11, 2008
7,460
3
76
Yup, and that's assuming it doesn't get delayed one more time. Really, I think it's way too early to discuss an upgrade path for the Witcher 3. Seven months is an eternity in the GPU space, especially when this generation is close to the end. In that time we'll have Classified/Lightning 880s and maybe even the R9 390/390X. Those who will be upgrading to an 880 this fall are not going to do it for the Witcher 3 specifically, but probably because their 2-4 year upgrade cycle falls around the time the 880 launches (say, 7950/7970/570/580/670/680 users). The gamer who will be upgrading almost exclusively for the Witcher 3 will wait as long as possible, since games do get delayed, and in seven months GPUs will only get cheaper and/or faster.

Hawaii is a newer architecture and will be relevant for quite a while, which is great news considering you can grab reference 290s off eBay for <$250 ($30 more for non-reference with better coolers). That's what I did: bought a pair of reference 290s ($410 for both), put waterblocks on them and set them up on their own loop, then OC'd the s*** out of them. :D That's a hell of a lot of silent GPU power for the money.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Avg frames on my rig have gone up from the mid-40s a couple months ago to the mid-50s now, with a single OC'd 290 (at 1440p and HQ settings), which really shows how well they've done with optimization. In xfire I get high 70s, and while I can tell some difference, it isn't night and day like in a demanding online FPS. So one 290 is good, plenty enough to enjoy the game with most of the visuals cranked up.

Oh, and yeah, it's f***ing gorgeous, too. :wub:

Wait a sec. If I'm understanding you properly, you're claiming that you have access to the Witcher 3 beta?

I'd have to see some proof to believe that. There is no Witcher 3 beta as far as I know, and actually, there were some fake websites devoted to such a thing a while ago which were taken down by CDPR.

Also, CDPR themselves said there will be no beta.

There is a beta for the Witcher Adventure game, but not the Witcher 3.
 

Ken145

Junior Member
Aug 1, 2014
17
0
0
Incorrect.
Even pre-3.0 PhysX has multithreading, e.g. the excellently multithreaded Metro 2033. Newer versions have "automatic MT".
True, but developers don't bother to optimize PhysX effects for the CPU. Metro LL runs terribly in CPU mode; it's not even saturating a single core while running at sub-10 or sub-20 fps when you trigger the particles or enter an area with fluids. They stated themselves that all the PhysX work was handled by the Nvidia guys, who obviously don't prioritize the non-GeForce user experience. Same with 2033, only it didn't have enough effects to significantly affect performance even in CPU mode.

An OCed 290 (1225 core/1325 RAM) runs the beta beautifully at 1440p on HQ settings.
What about VHQ settings and AA?
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
I think people here are getting a bit too worried, but the fact is that even a GTX 770 or R9 280X will run this game much better than the so-called next-gen consoles.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I don't expect my current rig in sig to have any difficulty playing the Witcher 3 at ultra quality settings @ 1440p with TSMAA, or SMAA, with good performance.

And that's with PhysX on high too. :D I wonder what their rendering pipeline will look like? Will it support DX11 multithreading like Watch Dogs, or DX11.1 for better CPU performance?
 

I/O

Banned
Aug 5, 2014
140
0
0
True, but developers don't bother to optimize PhysX effects for the CPU. Metro LL runs terribly in CPU mode; it's not even saturating a single core while running at sub-10 or sub-20 fps when you trigger the particles or enter an area with fluids. They stated themselves that all the PhysX work was handled by the Nvidia guys, who obviously don't prioritize the non-GeForce user experience. Same with 2033, only it didn't have enough effects to significantly affect performance even in CPU mode.


What about VHQ settings and AA?

Nvidia purposely hobbles PhysX effects on the CPU, which stands to reason. This has nothing to do with the game developers.
 

I/O

Banned
Aug 5, 2014
140
0
0
I don't expect my current rig in sig to have any difficulty playing the Witcher 3 at ultra quality settings @ 1440p with TSMAA, or SMAA, with good performance.

And that's with PhysX on high too. :D I wonder what their rendering pipeline will look like? Will it support DX11 multithreading like Watch Dogs, or DX11.1 for better CPU performance?

Physx does not run properly on the CPU for clear reasons.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Nvidia purposely hobbles PhysX effects on the CPU, which stands to reason. This has nothing to do with the game developers.

Seriously, you keep repeating the same nonsense over and over.

PhysX effects on the CPU aren't purposely hobbled. Witcher 3 will use PhysX version 3.3, or even 3.4, and all the low level stuff plus cloth and destruction (at default settings) will run on the CPU.

So if NVidia is purposely hobbling CPU PhysX, why has CDPR decided to run such critical physics components on the CPU?

Physx does not run properly on the CPU for clear reasons.

I use a dedicated GPU for PhysX :p But even so, the CPU will still be running all of the low level and destruction effects.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
True, but developers don't bother to optimize PhysX effects for the CPU. Metro LL runs terribly in CPU mode; it's not even saturating a single core while running at sub-10 or sub-20 fps when you trigger the particles or enter an area with fluids. They stated themselves that all the PhysX work was handled by the Nvidia guys, who obviously don't prioritize the non-GeForce user experience. Same with 2033, only it didn't have enough effects to significantly affect performance even in CPU mode.


What about VHQ settings and AA?

The CTO of 4A games:

PC Games Hardware: When benchmarking Metro 2033 we found out that the engine utilized more than four cores of multicore CPUs if we were using the advanced PhysX effects on CPU, so you are utilizing Nvidias PhysX SDK 3.x? Will all the advanced PhysX effects only be available in PC version?

Oles Shishkovtsov: That's the common misconception that PhysX 2.X cannot be multithreaded. Actually it is internally designed to be multithreaded! The only thing – it takes some programmer time to enable that multi-threading (actually task generation), mostly to integrate with engine task-model and ensure proper load-balancing. So, 2033 used PhysX 2.8.3, and Last Light uses similar, a slightly modified version at the time of writing. And yes, advanced PhysX effects will be available only on PC.

http://www.pcgameshardware.com/aid,...irectX-11-Tessellation-GPU-Physx-und-Co/News/
 

Ken145

Junior Member
Aug 1, 2014
17
0
0
He's talking about more generic stuff; they have PhysX middleware integrated deeply inside their engine, but what I'm talking about is how badly advanced PhysX effects like particles, fluids, etc. are optimized for CPU/software mode. In a more recent interview he says you would need an extreme CPU for advanced PhysX in Metro LL, because they're basically emulating highly parallel GPU work on a weak serial CPU, or something like that. Anyway, it's not an excuse for how badly advanced PhysX can run on the CPU even without significant action on the screen.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
The CTO offered the reason:

Some things are obscenely well parallelized and thus it is difficult to achieve the performance of a 2.0-TF-graphics card with a 0.1-TF-processor.
 

MagickMan

Diamond Member
Aug 11, 2008
7,460
3
76
Wait a sec. If I'm understanding you properly, you're claiming that you have access to the Witcher 3 beta?

I'd have to see some proof to believe that. There is no Witcher 3 beta as far as I know, and actually, there were some fake websites devoted to such a thing a while ago which were taken down by CDPR.

Also, CDPR themselves said there will be no beta.

There is a beta for the Witcher Adventure game, but not the Witcher 3.

I don't care what you believe, it's been in closed beta for quite a while now.
 

I/O

Banned
Aug 5, 2014
140
0
0
Seriously, you keep repeating the same nonsense over and over.

No, that is not the case. I am about to offer ... "FREE knowledge" to you.

PhysX effects on the CPU aren't purposely hobbled. Witcher 3 will use PhysX version 3.3, or even 3.4, and all the low level stuff plus cloth and destruction (at default settings) will run on the CPU.

If PhysX runs on the CPU, that does not mean it will run well. If they actually optimized PhysX for the CPU, it would completely negate the necessity of GPU PhysX, which would have Nvidia caught in a huge lie and the snake oil would be found out. Take BL2 for instance: the PhysX runs like poop on the CPU, along with the Metro games and the Batman games, etc. See the pattern here?

So if NVidia is purposely hobbling CPU PhysX, why has CDPR decided to run such critical physics components on the CPU?

Nvidia has to set PhysX apart in some way between GPU and CPU, otherwise it would not make sense from a marketing standpoint to have GPU-accelerated PhysX marketed as a ... "showstopping" selling feature on their GPUs, and it would no longer be an Nvidia-only feature that makes their hardware stand out from the crowd. SSE on modern CPUs is a better instruction set for PhysX than the old-school x87 path, but Nvidia invested in PhysX a long time ago when CPUs were not strong enough to run PhysX via SSE instructions. Most people that tout the benefits of PhysX do not even know how it works, for if they did they would be enlightened to the fact that PhysX is just a bunch of ... "Snake Oil".

I use a dedicated GPU for PhysX :p But even so, the CPU will still be running all of the low level and destruction effects.
You still play games that have PhysX?
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
I don't care what you believe, it's been in closed beta for quite a while now.

You'd be a lot more believable if your AMD cards performed badly, and/or had high performing Nvidia cards in comparison :Colbert: :sneaky:
 

I/O

Banned
Aug 5, 2014
140
0
0
You'd be a lot more believable if your AMD cards performed badly, and/or had high performing Nvidia cards in comparison :Colbert: :sneaky:
That has squat to do with whether or not he actually has access to a ... "Closed Beta" for TW3.


Five accounts and counting? Really?

Please stay away this time.

esquared
Anandtech Forum Director
 
Last edited by a moderator:

Vapid Cabal

Member
Dec 2, 2013
170
10
81
Yup, and that's assuming it doesn't get delayed one more time. Really, I think it's way too early to discuss an upgrade path for the Witcher 3. Seven months is an eternity in the GPU space, especially when this generation is close to the end. In that time we'll have Classified/Lightning 880s and maybe even the R9 390/390X. Those who will be upgrading to an 880 this fall are not going to do it for the Witcher 3 specifically, but probably because their 2-4 year upgrade cycle falls around the time the 880 launches (say, 7950/7970/570/580/670/680 users). The gamer who will be upgrading almost exclusively for the Witcher 3 will wait as long as possible, since games do get delayed, and in seven months GPUs will only get cheaper and/or faster.


Agreed. I ALMOST upgraded my xfire 7970s in anticipation... thank goodness I did not. My plan is to keep a nice little "Witcher 3 slush fund" and maybe upgrade AFTER trying it out on what I already have :)
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I don't care what you believe, it's been in closed beta for quite a while now.

And I don't care what you say. People can claim anything, especially on the internet.

I just wanted to point out the discrepancy between what you say, and what CDPR said concerning the Witcher 3 beta, or lack thereof.