It would also be interesting to see how much video RAM Witcher 3 needs in actual gameplay.
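For anyone planning to measure that themselves on an NVIDIA card, nvidia-smi can be polled while the game runs. A rough sketch (assumes a single GPU and that nvidia-smi is on the PATH):

```python
# Rough sketch: poll VRAM usage every few seconds with nvidia-smi while
# the game runs (NVIDIA only; assumes a single GPU and that nvidia-smi
# is on the PATH). Peak usage is what matters for "how much VRAM it needs".
import subprocess
import time

def vram_usage_mib():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = (int(x) for x in out.strip().split(", "))
    return used, total

peak = 0
try:
    while True:
        used, total = vram_usage_mib()
        peak = max(peak, used)
        print(f"VRAM: {used}/{total} MiB (peak so far: {peak} MiB)")
        time.sleep(5)
except KeyboardInterrupt:
    pass
```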
Makes me feel a little bit better about blowing all that money on the ROG Swift monitor. I've got a single 780 Ti, which means I'm at even more of a disadvantage playing at 2560x1440 rather than 1920x1080.
I finished Witcher 1 just two months back, and the game was pushing 5GB of RAM (out of my 6GB total).
Personally, I trust the work of those guys.
Disappointed. You'd think they would target hexa-cores by now. The CPU requirements are copy-pasted from Mordor (and why are they stuck on IVB? They couldn't slap in some Haswell FMA3 and AVX2 support?), although if a 770 is recommended, at least they should be optimising around Kepler. From memory, Witcher 2 had a fat old Day 1 patch that improved everything. Meh, really.
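As an aside, if you want to check which instruction sets your own CPU exposes, the third-party py-cpuinfo package makes it a one-liner. A rough sketch, assuming the package is installed:

```python
# Quick check of which instruction sets your CPU reports, using the
# third-party py-cpuinfo package (pip install py-cpuinfo). FMA3 shows
# up as the plain "fma" flag in CPUID terms.
import cpuinfo

flags = set(cpuinfo.get_cpu_info().get("flags", []))
for feature in ("avx", "avx2", "fma"):
    print(f"{feature}: {'yes' if feature in flags else 'no'}")
```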
Yes, they are. The newer 285 beats the flagship 290X by 69% at tessellation, while a 980 is 2.75-3X faster than a 290X at 32X-64X tessellation factors. It's so bad that a 760 is faster than every card AMD makes outside of the 285. AMD barely improved tessellation performance when moving from Tahiti XT to Hawaii. If a game uses a lot of tessellation, it's going to hammer the current generation of GCN cards. Outside of AMD's 285, their tessellation performance is barely at Fermi GTX 580 level.
Being a GW title has nothing to do with the performance hit of MSAA. MSAA has always had a large performance hit, and deferred rendering engines only exacerbated that.

Considering this will be a GW title, expect to see a 50-100% performance hit with MSAA on, and pretty much no chance that the R9 200 series and below will be faster than NV's cards.
That was found out to be a hoax. But anyway, anyone who uses 8x MSAA is clueless, in my opinion.

If the above is true, it's looking pretty reasonable. Given that Witcher 3 launches at the end of May, it's possible the R9 300 and GM200 series will be out, which should ensure the game can be maxed out at 60 fps with 4xMSAA at 1080p. As expected, 1440p and above users will need SLI/CF.
That was a hoax as well. Witcher 3 will be running at mostly high settings on both the PS4 and Xbox One, with some of the more demanding settings on medium if I had to guess. Also, Xbox One will likely get 900p and PS4 1080p, both at 30 FPS.

Makes sense, considering a 780 Ti is ~2.3X faster than a 7850. If I were upgrading now for Witcher 3, I'd choose NV, but my advice, as always, is to wait until the game is released, as it's still 5 months away.
The RAM requirements always allow for other applications or programs (plus the OS itself) to be running, which is why they are so high. Even these newer games don't use over 3GB of RAM by themselves, but if you factor in the OS plus a browser or whatever, then you can easily have well over 6GB of RAM in use.

As for the CPU/RAM requirements, those are probably BS. What game runs faster/better with 8GB of RAM vs. 6GB?
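If you'd rather measure than guess, here's a small psutil sketch that prints total system RAM in use alongside the game's own working set. The process name "witcher3.exe" is just a placeholder; substitute whatever the actual executable is called.

```python
# Sketch: how much RAM the game itself uses vs. the system as a whole.
# Needs the third-party psutil package; "witcher3.exe" is a placeholder
# process name, not something taken from the game.
import psutil

vm = psutil.virtual_memory()
print(f"System RAM in use: {vm.used / 2**30:.1f} / {vm.total / 2**30:.1f} GiB")

for proc in psutil.process_iter(["name", "memory_info"]):
    if proc.info["name"] and proc.info["name"].lower() == "witcher3.exe":
        rss = proc.info["memory_info"].rss
        print(f"Game working set: {rss / 2**30:.1f} GiB")
```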
Those CPU specs got me a bit puzzled. I'm actually wondering if there will be any significant difference in performance at 1080p between the 3770 and 2500, especially with an overclocked 2500. Will be interesting to see CPU benchmarks.
And what about the RAM: are we coming to a point in time where 16GB is now the new sweet spot? Maybe it is already and I just haven't been keeping up to date?
Awfully tempted to lure myself into a 4790K because of TW3. I mean, it's TW3; it's totally understandable to chase an ultimate system for it.
I'll have one of those CPUs in a cart by month's end, I bet.
Yeah, but tessmark on extreme is way over what any game would ever use, so it's not an accurate starting point to judge GCN's tessellation performance in actual games.
I get that TessMark uses tessellation to the extreme. The point is that if any modern game uses A LOT of tessellation, it's going to translate into a massive performance hit for current GCN cards. Their tessellation performance is nearly 3X lower than a 980's. The reason we don't see a 290X being 3X slower than a 980 is that a game is not 100% tessellated like NV's tessellated city demo; games are limited by a mix of geometry, texture, shader, pixel/color fill-rate and memory bandwidth. Recent games are not tessellation-heavy enough to push the tessellation bottleneck on GCN cards above those other bottlenecks. With enough tessellation, however, it's easily possible, which was my main point. You claimed that GCN is not weak in tessellation, but it is. What you meant to say is that modern games aren't demanding enough in their tessellation use to show just how weak GCN is at tessellation vs. Kepler/Maxwell.
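For anyone who wants to see why a ~3X tessellation deficit doesn't show up as a ~3X gap in games, here's a back-of-the-envelope sketch of that argument. The share-of-frame-time numbers are invented for illustration; only the 3X penalty comes from the posts above.

```python
# Back-of-the-envelope model: if only part of the frame time is
# tessellation-bound, a 3x tessellation deficit does not make the whole
# frame 3x slower. The share-of-frame figures below are invented.
def overall_slowdown(tess_share, tess_penalty):
    """Frame-time multiplier when the tessellation-bound share of the
    frame gets tess_penalty times slower and the rest stays the same."""
    return (1 - tess_share) + tess_share * tess_penalty

for share in (0.05, 0.15, 0.30):
    m = overall_slowdown(share, 3.0)
    print(f"{share:.0%} tessellation-bound -> frame {m:.2f}x slower "
          f"({1 / m:.0%} of the original fps)")
```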
MSAA is a hardware-based AA solution, so I don't think that code optimization really factors heavily into its performance. Either the GPU has the resources to use it, or it doesn't. What affects MSAA performance (other than hardware specs) are things like whether an engine uses deferred rendering, and also what kind of game it is.

Normally in deferred engines the MSAA performance hit is around 25-40%, as seen in BF4. In GW titles, NV focuses on TXAA/FXAA while AMD has no access to the source code until after the game launches, so the hit ends up at 50-100% for both camps, as we've seen in recent titles where MSAA optimization from NV/AMD is nearly non-existent. GW itself doesn't cause MSAA performance to fall through some trick; it's an indirect result of where the optimization focus ends up on GW titles, which generally results in massive MSAA drops on both AMD and NV. Instead of the already large 25-40% hit, since neither NV nor AMD optimize for MSAA as much as in the past, the performance hit with MSAA has continued to increase over the years. If you look at the MSAA performance hit in GW titles, it's more severe than in other games that use deferred engines. The MSAA hit in AC Unity and FC4, for example, is out of this world.
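To put the quoted percentages into fps terms, here's a tiny calculation. I'm reading an "X% performance hit" as frame time growing by X% (so a 100% hit halves the frame rate); that interpretation is mine, and the 60 fps baseline is just an example.

```python
# Converting an "X% performance hit" into fps, treating it as an X%
# increase in frame time (my interpretation of the figures above).
def fps_with_hit(base_fps, hit):
    frame_ms = 1000 / base_fps * (1 + hit)
    return 1000 / frame_ms, frame_ms

for hit in (0.25, 0.40, 0.50, 1.00):
    fps, ms = fps_with_hit(60, hit)
    print(f"{hit:.0%} hit: 60 fps -> {fps:.1f} fps ({ms:.1f} ms/frame)")
```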
To me, maxing out a game implies that all graphical options are turned on to their highest setting. But AA, to me, is a bit of an exception. You always reach a point where diminishing returns make haphazardly increasing the level of MSAA foolish, not to mention the horrendous performance hit.

That's not the point. The point is that "maxing out" Witcher games implies turning on all possible settings in the game to the max. Since Witcher 2 featured Uber-sampling, which itself is more demanding than 8xMSAA, using the 780 Ti as a reference point with 8xMSAA would tell us that it can't possibly max out Witcher 3 (i.e. with Uber-sampling mode on).
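Rough numbers behind the Uber-sampling vs. 8xMSAA comparison. This sketch assumes Uber-sampling behaves like ~2x2 supersampling and that MSAA shades roughly one fragment per pixel while multiplying coverage samples; real engine costs are messier, so treat it as illustration only.

```python
# Rough sample counts at 1920x1080, assuming Uber-sampling is ~2x2
# supersampling (an assumption) and that 8xMSAA shades roughly one
# fragment per pixel while storing 8 coverage samples per pixel.
width, height = 1920, 1080
pixels = width * height

ssaa_shaded = pixels * 2 * 2      # supersampling shades every sub-sample
msaa8_shaded = pixels             # MSAA shades ~1 fragment per pixel...
msaa8_stored = pixels * 8         # ...but stores/resolves 8 samples each

print(f"Shaded fragments -  2x2 SSAA: {ssaa_shaded:,}   8xMSAA: {msaa8_shaded:,}")
print(f"Stored samples   -  2x2 SSAA: {ssaa_shaded:,}   8xMSAA: {msaa8_stored:,}")
```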
Here is one article that mentions they are targeting 900p for the Xbox One and 1080p for the PS4.

OK, fair enough, but if you have new info that contradicts older info, please post it as back-up. If Witcher 3 will run at 1080p @ 30 fps on High on a PS4 (7850-class GPU), then it should fly on a 970/290X/980.
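A naive scaling check on the "it should fly" claim: if a PS4-class GPU (roughly a 7850) holds 30 fps at 1080p/High, a card N times faster would manage roughly N x 30 fps at similar settings. The ~2.3X figure for the 780 Ti comes from the post above; the other multipliers are ballpark assumptions, and real scaling is rarely this linear.

```python
# Naive linear scaling from a PS4-class baseline (roughly a 7850 doing
# 30 fps at 1080p/High). The 2.3x for the 780 Ti is from the thread;
# the other multipliers are ballpark assumptions.
baseline_fps = 30
relative_speed = {"GTX 780 Ti": 2.3, "GTX 970": 2.2, "R9 290X": 2.2, "GTX 980": 2.6}

for card, mult in relative_speed.items():
    print(f"{card}: ~{baseline_fps * mult:.0f} fps, if scaling were perfectly linear")
```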
Tons of people do that. The most frequently used program on any PC is the web browser. A lot of people leave their browser and all its tabs open during gaming because they don't want to close them, and these 64-bit browsers can use a lot of RAM if you have a lot of tabs open.

1. Who buys high-end videocards and then runs games with a bunch of other programs in the background? That's like buying a 980 and running your game at 290/970 level of performance because one is too lazy to close programs.
Yet, none of this changes the fact that it does nothing tangible for gaming performance, unless you are talking about measuring level loading times with a stop-watch or running a RAM disk with your excess RAM.
I don't believe you. What resolution were you running at?
1440p / max settings on a GTX 770, but maybe the bunch of Internet Explorer tabs I had running did add to it.
The game even crashed a few times; I suspected it was running out of RAM.
Yeah, I'm highly skeptical a game from 2007 running on the same engine as the original Neverwinter Nights would use that much memory. If anything, the game probably isn't large address aware and ran out of memory supported by the engine, not the total memory on your graphics card/system.
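If anyone wants to verify the large-address-aware guess rather than speculate, the flag lives in the EXE's PE header and can be read with a few lines of standard-library Python. The path below is just an example; point it at your own install.

```python
# Check whether a Windows EXE is flagged Large Address Aware by reading
# the Characteristics field of its PE header. Standard library only;
# the path below is only an example.
import struct

IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020

def is_large_address_aware(exe_path):
    with open(exe_path, "rb") as f:
        f.seek(0x3C)                            # e_lfanew: offset of the PE header
        pe_offset = struct.unpack("<I", f.read(4))[0]
        f.seek(pe_offset)
        if f.read(4) != b"PE\x00\x00":
            raise ValueError("not a PE executable")
        f.seek(pe_offset + 22)                  # COFF header Characteristics field
        characteristics = struct.unpack("<H", f.read(2))[0]
    return bool(characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)

print(is_large_address_aware(r"C:\Games\The Witcher\System\witcher.exe"))
```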
I built my new system with this game in mind, and then they moved the release date!!
CPU: Intel Core i7-4790K 4.0GHz Quad-Core Processor
CPU Cooler: Swiftech H240-X 90.0 CFM Liquid CPU Cooler
Motherboard: Gigabyte GA-Z97X-UD3H-BK ATX LGA1150 Motherboard
Memory: G.Skill Ripjaws X Series 16GB (2 x 8GB) DDR3-2133 Memory
Video Card: Asus GeForce GTX 970 4GB STRIX Video Card (2-Way SLI)
Video Card: Asus GeForce GTX 970 4GB STRIX Video Card (2-Way SLI)
Case: Corsair 550D ATX Mid Tower Case
Monitor: Perfect Pixel QNIX QX2710 LED Evolution II [Glossy] 27" 2560x1440 PLS PC Monitor
Games are not using over 8 gigs of RAM yet. If your RAM monitor maxes out, or if you have 16 gigs and your RAM monitor goes over 8, it's most likely the pagefile pushing it past the 8 gigs.
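A quick way to tell whether the big number in your memory monitor is physical RAM or pagefile-backed: psutil reports the two separately (on Windows, swap_memory() reflects the pagefile, though it's only a rough proxy for what Task Manager calls commit).

```python
# Physical RAM vs. pagefile, via the third-party psutil package. On
# Windows, swap_memory() reflects the pagefile; it's only a rough proxy
# for what Task Manager reports as commit.
import psutil

vm = psutil.virtual_memory()
sw = psutil.swap_memory()
print(f"Physical RAM used : {vm.used / 2**30:.1f} / {vm.total / 2**30:.1f} GiB")
print(f"Pagefile/swap used: {sw.used / 2**30:.1f} / {sw.total / 2**30:.1f} GiB")
```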
Is it so important to play a game on day 1 that you can't simply order the system after the game comes out?