
What do you think the Witcher 3 will take to run?


Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
If PhysX is run on the CPU, that does not mean it will run well, for if they did optimize PhysX for the CPU, it would completely negate the necessity of GPU PhysX, which would have Nvidia caught in a huge lie, and the snake oil would be found out.

So basically you're claiming that a CPU is just as powerful as a GPU for physics? You must be trolling me.

Take BL2, for instance: PhysX runs like poop on the CPU, along with the Metro games, the Batman games, etc. See the pattern here?
These are all older titles that didn't use PhysX 3.3, which the Witcher 3 will be using. PhysX 3.3 has extensive SIMD and multithreading optimizations (which are mandatory), and is MUCH faster than those older PhysX SDKs.

Fast enough to do very good cloth simulation, in fact.

Nvidia has to set PhysX apart in some way between GPU and CPU; otherwise it would not make sense from a marketing standpoint to market GPU-accelerated PhysX as a "showstopping" selling feature of their GPUs, and it would no longer be an Nvidia-only feature that makes their hardware stand out from the crowd.
Hardware-accelerated physics is always going to be more powerful than software physics. My dedicated GTX 650 Ti PhysX card has over 1 teraflop of computational power, and it cost me less than 100 dollars.

By the time 1-teraflop consumer CPUs that can actually handle advanced physics simulations without massive slowdowns become available (likely using 512-bit vectors and more than 8 cores), they will not only cost a pretty penny, but GPUs will be able to offer far greater performance at a much lower cost.
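For rough context, here's a back-of-the-envelope sketch of those throughput claims (reference clocks assumed; peak FLOPS is a theoretical ceiling, not sustained physics performance):

```python
# Rough peak single-precision math: units * clock * FLOPs per unit per cycle
# (an FMA counts as 2 FLOPs). Theoretical ceilings only, clocks assumed.

def peak_tflops(units, clock_ghz, flops_per_cycle):
    return units * clock_ghz * flops_per_cycle / 1000.0

# GTX 650 Ti: 768 CUDA cores @ 928 MHz, 1 FMA per core per cycle.
print(f"GTX 650 Ti:      {peak_tflops(768, 0.928, 2):.2f} TFLOPS")  # ~1.43

# Hypothetical 8-core 3.5 GHz CPU with 256-bit AVX2 and 2 FMA units:
# 8 floats * 2 units * 2 FLOPs = 32 FLOPs per core per cycle.
print(f"8-core AVX2 CPU: {peak_tflops(8, 3.5, 32):.2f} TFLOPS")     # ~0.90

# The same core count with 512-bit vectors doubles that to 64 per cycle.
print(f"8-core 512b CPU: {peak_tflops(8, 3.5, 64):.2f} TFLOPS")     # ~1.79
```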

There is definitely a future for hardware physics processing.

SSE on modern CPUs is still a better instruction set to use for PhysX than old-school x87, but Nvidia invested in PhysX a long time ago, when CPUs were not strong enough to run PhysX via SSE instruction sets. Most people that tout the benefits of PhysX do not even know how it works, for if they did, they would be enlightened to the fact that PhysX is just a bunch of ... "snake oil".
Novodex and Ageia were the ones that used x87 for PhysX on the CPU. Nvidia completely overhauled the codebase, adding SSE and explicit multithreading, and released PhysX 3.0 in 2011, so you're years behind the times, apparently.
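To illustrate why that SIMD-style batching matters, here's a toy NumPy analogy (not PhysX's actual code, just the same data-parallel idea):

```python
import numpy as np

# Toy analogy for the x87-vs-SSE difference: updating particles one at a
# time versus as one wide, batched array operation. PhysX 3.x does the
# batched equivalent with SSE intrinsics plus multiple threads.

N = 100_000
pos = np.zeros((N, 3), dtype=np.float32)
vel = np.random.randn(N, 3).astype(np.float32)
gravity = np.array([0.0, -9.81, 0.0], dtype=np.float32)
dt = np.float32(1.0 / 60.0)

def step_scalar(pos, vel):
    # Old-school style: one particle per iteration, no data parallelism.
    for i in range(N):
        vel[i] += gravity * dt
        pos[i] += vel[i] * dt

def step_batched(pos, vel):
    # SIMD-style: the whole particle set in a few wide operations.
    vel += gravity * dt
    pos += vel * dt
```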

You still play games that have PhysX?
Of course.
 

Aolish

Senior member
Jan 1, 2002
336
4
81
This is with the idea that anything under a 60 fps minimum with absolute max settings (including AA) is considered choking and unplayable, right?

I'm imagining something much higher than the current hardware specs the XB1 and PS4 have. For some reason, I notice a lot of people seem to forget about the level of optimization that developers can get out of these consoles.
 

MagickMan

Diamond Member
Aug 11, 2008
7,460
3
76
And I don't care what you say. People can claim anything, especially on the internet.

I just wanted to point out the discrepancy between what you say and what CDPR said concerning the Witcher 3 beta, or lack thereof.

There is no open beta, there will be no open beta. That's all that's been said. I'm not in an open beta.
 

Spjut

Senior member
Apr 9, 2011
932
162
106
I think people here are getting a bit too worried, but the fact is that even a GTX 770 or R9 280X will run this game much better than the so-called next-gen consoles.

Well, many don't have those GPUs.
And considering one of those GPUs alone is around the 300 USD price range, they bloody well should run games better than the PS4.
 

Arkade

Junior Member
Mar 29, 2015
9
0
0
I predict that an overclocked 970 with an overclocked i7 should run this on ultra settings with around 45-50 fps average.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I predict that an overclocked 970 with an overclocked i7 should run this on ultra settings with around 45-50 fps average.

No way. The developer already hinted 980 for recommended and then they defined recommended as VHQ at 1080P @ 30-35 fps, but not Ultra. For Ultra @ 60 fps with Uber sampling, I am guessing Titan X SLI.
 

alcoholbob

Diamond Member
May 24, 2005
6,389
468
126
No way. The developer already hinted 980 for recommended and then they defined recommended as VHQ at 1080P @ 30-35 fps, but not Ultra. For Ultra @ 60 fps with Uber sampling, I am guessing Titan X SLI.

The 30-35 fps number was the Ultra settings performance back in January on a 780 Ti.

Since then, Gamestar magazine got to play the current build of the game two weeks ago and said a GTX 980 averaged 60 fps on Ultra.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126

The 30-35 fps number was the Ultra settings performance back in January on a 780 Ti.

Since then, Gamestar magazine got to play the current build of the game two weeks ago and said a GTX 980 averaged 60 fps on Ultra.
Sounds good if true, but I am reluctant to believe it's 60 fps on Ultra with Uber Sampling at 1080P. Of course, Uber Sampling is not a requirement. On the positive side, the game could last 200+ hours to do everything!
http://www.dualshockers.com/2015/03...ut-the-ps4-version-could-last-over-200-hours/

Ubersampling at 1080p is just the same as running the game at 4K, or at 200% resolution scale in Battlefield 4. I wouldn't consider it part of the core Ultra settings, since it's like enabling DSR.
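The pixel math behind that comparison, as a quick sketch:

```python
# 2x2 ubersampling shades 2x the pixels in each dimension, so 1080p with
# ubersampling costs roughly as much shading work as native 4K.
w, h, scale = 1920, 1080, 2
print(w * scale, h * scale)               # 3840 2160 (i.e., 4K)
print((w * scale * h * scale) / (w * h))  # 4.0x the shaded pixels
```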
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The up-close textures on the character models look pretty good, much better than ACU.

[Screenshot: witcher2.jpg]


The amount of detail in the open world is tremendous. I'll believe it when I see benches of 980 hitting 60 fps at 1080P with every setting besides Uber Sampling maxed out. I have my doubts.

[Screenshot: goKQppNpRda2.png]


Having said that, for slower-paced games like TW3, I feel 60 fps is absolutely not necessary for a good gaming experience. It's not an online FPS. I would take every setting maxed out at 30 fps over Medium settings at 60 fps for this style of game.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
The bigger question is: did Nvidia kill Kepler in this game? If a 980 can hit 60 fps on Ultra, a 780 Ti should be more or less the same. And I still think that if this were a PC exclusive, it'd be even better graphics-wise.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
I think it will run at 60 fps maxed @ 1080p with a 980/290X without any sort of hardware-generated anti-aliasing. Once you enable MSAA/TXAA/SMAA 2x, you're going to need two of those cards for 1080p, and 2x Titan X for 2560x1600/1440.

This seems to be the norm for all GameWorks games. Also, it's a certainty you're going to need at least 4GB of VRAM to run maxed textures without having a stutterfest :p
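A rough sketch of why hardware AA multiplies the memory cost (render targets only; it's the game's textures that actually push you toward 4GB, and real GPUs also use sample compression):

```python
# Approximate framebuffer size with MSAA: color (RGBA8) and depth/stencil
# (D24S8) each store one value per sample. Ignores compression and any
# extra G-buffer targets a deferred renderer would add.
w, h = 2560, 1440
bytes_per_sample = 4 + 4  # RGBA8 color + D24S8 depth/stencil
for samples in (1, 2, 4, 8):
    mb = w * h * bytes_per_sample * samples / 2**20
    print(f"{samples}x: ~{mb:.0f} MB")  # ~28, 56, 113, 225 MB
```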
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Those textures and overall graphics don't impress me one bit. It seems we should be much further along in 2015. I want my Unreal Kite demo-looking games this year; I'm tired of all these dumbed-down console ports.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
I think it will run at 60 fps maxed @ 1080p with a 980/290X without any sort of hardware-generated anti-aliasing. Once you enable MSAA/TXAA/SMAA 2x, you're going to need two of those cards for 1080p, and 2x Titan X for 2560x1600/1440.

This seems to be the norm for all GameWorks games. Also, it's a certainty you're going to need at least 4GB of VRAM to run maxed textures without having a stutterfest :p

Hmm. Would a 7950 3GB GDDR5 @ 800/1250 have a chance at 1080P, without any hardware AA?
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
The bigger question is: did Nvidia kill Kepler in this game? If a 980 can hit 60 fps on Ultra, a 780 Ti should be more or less the same. And I still think that if this were a PC exclusive, it'd be even better graphics-wise.

I'm going to say no: Kepler doesn't have the compute abilities of Maxwell, and when a game uses GameWorks, it is using lots of compute.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
I'm going to say no: Kepler doesn't have the compute abilities of Maxwell, and when a game uses GameWorks, it is using lots of compute.

Maxwell might have more powerful shaders, but Kepler has quite a lot more of them: a 780 has 2304, whereas a 970 has 1664; a 780 Ti has 2880 and a 980 has 2048. Theoretically, this should even things out.
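Putting clocks into that comparison, as a rough sketch (reference boost clocks assumed, and paper FLOPS ignore Maxwell's per-shader efficiency gains):

```python
# Theoretical single-precision throughput = shaders * boost clock * 2 (FMA).
# Reference boost clocks assumed; retail cards vary.
cards = {
    "GTX 780":    (2304, 0.900),
    "GTX 970":    (1664, 1.178),
    "GTX 780 Ti": (2880, 0.928),
    "GTX 980":    (2048, 1.216),
}
for name, (shaders, ghz) in cards.items():
    print(f"{name}: {shaders * ghz * 2 / 1000:.1f} TFLOPS")
# 780 ~4.1, 970 ~3.9, 780 Ti ~5.3, 980 ~5.0: Kepler still leads on paper.
```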
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Those textures and overall graphics don't impress me one bit. It seems we should be much further along in 2015. I want my Unreal Kite demo-looking games this year; I'm tired of all these dumbed-down console ports.

Well, right now it's looking like the best open-world RPG of all time. It's unrealistic for a developer to make a $30-50 million video game with 100-200 hours of gameplay and have it only on the PC. That might only be possible in the early stages, when you are trying to establish the name. The studio is way too large now to make a game exclusively for the PC and not reap the rewards of selling it on the PS4/XB1 consoles.

As far as the Unreal Kite demo goes, I believe Tom Petersen said that demo ran at 30 fps on a single Titan X. There is only one character in the demo. That means that to make a full game with that level of graphics, with many NPCs and explosions/destructible environments, we would need a GPU 2-3x more powerful than the Titan X, because UE4 doesn't support SLI/CF.

Don't forget that a lot of games coming out in 2015 were being designed in 2012-2013. At that time, the developers wouldn't have been targeting a Titan X as a minimum requirement (which is what the UE4 Kite demo has as a min req). The two firms that pushed PC graphics to the limits were Crytek, with the Crysis franchise, and THQ, with the Metro franchise. Both of those companies struggled financially for years. It seems like Star Citizen will offer next-gen PC gaming graphics, but it's a ways off. By the time it comes out in 2016, we could be months away from 14/16nm GPUs.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Maxwell might have more powerful shaders, but Kepler has quite a lot more of them: a 780 has 2304, whereas a 970 has 1664; a 780 Ti has 2880 and a 980 has 2048. Theoretically, this should even things out.

No, theoretically they don't even out. If you've looked at any of the compute benchmarks from before Maxwell until now, all the benchmarks show that Kepler did not do well in compute. It even showed up in compute-heavy games.

You could see the writing on the wall way back in the early days of Kepler, with games like Metro 2033, which used a lot of compute.

This conspiracy theory is just the result of Nvidia having underestimated the need for compute in modern games. They designed their cards for what they thought would be needed and made a poor prediction, like AMD did with their early 5000 series when it came to tessellation.