Witcher 3 PC specs announced!


Omar F1

Senior member
Sep 29, 2009
491
8
76
It'll also be interesting to see how much video RAM Witcher 3 needs in gaming.

I finished Witcher 1 just two months back; the game was pushing 5GB of RAM (out of 6GB total).
Personally, I trust the work of those guys.
 

njdevilsfan87

Platinum Member
Apr 19, 2007
2,341
264
126
Makes me feel a little bit better about blowing all that money on the ROG Swift monitor. :p I've got a single 780 Ti, which means I'm at even more of a disadvantage playing at 2560x1440 rather than 1920x1080.

But it has G-Sync, so you'll live. :p
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
I'm noticing more i7-3770 references in the recommended requirements of new AAA games.

In benchmarks HT isn't showing much improvement for a lot of these games, so I'm leaning towards just a fast quad core.

I did see DA:I benefit greatly from HT in one benchmark, despite a majority of benchmarks showing little benefit.


Awfully tempted to lure myself into a 4790/K because of TW3. I mean, it's TW3; it's totally understandable to chase an ultimate system for it.

I'll have one of those CPUs in a cart by month's end, I bet.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Disappointed. You'd think they would target hexa-cores by now. The CPU requirements are copy pasted from Mordor (and why are they stuck on IVB, they couldn't slap in some Haswell FMA3 and AVX2 support?) although if a 770 is recommended at least they should be optimising around Kepler. From memory Witcher 2 had a fat old Day 1 patch that improved everything. Meh really.

Nope. They'll never target an enthusiast platform; I don't know why you don't get that lol. When the i7-x770 parts come out as hexa-cores, then you'll get more benefit out of it. For now, if you purchase the enthusiast platform you should be getting it for more than gaming if you want it to be worth the cash.

Makes me feel a little bit better about blowing all that money on the ROG Swift monitor. :p I've got a single 780 Ti, which means I'm at even more of a disadvantage playing at 2560x1440 rather than 1920x1080.

Do you run at 8x MSAA though? Do people really use such high levels of AA? I don't really care to, but maybe others do. I think you'll find the game more than acceptable, no doubt.
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Yes, they are. The newer 285 destroys the flagship 290X by 69%, while 980 is 2.75-3X faster (32X-64X tessellation factor) than a 290X at tessellation. It's so bad that a 760 is faster than every card outside of 285 that AMD makes. AMD barely improved tessellation performance when moving from Tahiti XT to Hawaii. If a game uses a lot of tessellation, it's going to level current generation of GCN cards. Outside of AMD's 285, their tessellation level is barely at Fermi GTX580 level.

Yeah, but tessmark on extreme is way over what any game would ever use, so it's not an accurate starting point to judge GCN's tessellation performance in actual games.

Considering this will be a GW title, expect to see 50-100% performance hit with MSAA on, and pretty much no chance that R9 200 series and below will be faster than NV's cards.
Being a GW title has nothing to do with the performance hit of MSAA. MSAA has always had a large performance hit, and deferred rendering engines only exacerbated that.

If the above is true, it's looking pretty reasonable. Given that Witcher 3 launches at the end of May, it's possible R9 300 and GM200 series will be out which should ensure the game is maxed out at 60 fps with 4xMSAA at 1080P. As expected 1440P and above users will need SLI/CF.
That was found out to be a hoax. But anyway, anyone that uses 8x MSAA is clueless in my opinion..

Makes sense considering 780Ti is ~2.3X faster than a 7850. If I were upgrading now for Witcher 3, I'd choose NV, but my advice as always is to wait until the game is released as it's still 5 months away.
That was a hoax as well. Witcher 3 will be running at mostly high settings on both the PS4 and Xbox One, with some of the more demanding settings on medium if I had to guess. Also, Xbox One will likely have 900p, and PS4 will get 1080p, both at 30 FPS.

As for CPU/RAM requirements, those are probably BS. What game runs faster/better with 8GB of RAM vs. 6GB?
The RAM requirements always allow for other applications or programs (plus the OS itself) to be running, which is why they are so high. Even these newer games don't use over 3GB of RAM by themselves, but if you factor in the OS plus a browser or whatever, then you can easily have well over 6GB of RAM that is being used..
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
That was a hoax as well. Witcher 3 will be running at mostly high settings on both the PS4 and Xbox One, with some of the more demanding settings on medium if I had to guess. Also, Xbox One will likely have 900p, and PS4 will get 1080p, both at 30 FPS.

That pretty much sounds like a match for how Dragon Age Inquisition compares between the Xbox One and PS4 versions, then. I do expect TW3 to have better textures than DAI though.
 

Plimogz

Senior member
Oct 3, 2009
678
0
71
Those CPU specs got me a bit puzzled. I'm actually wondering if there will be any significant difference in performance at 1080p between the 3770 and 2500, especially with an overclocked 2500. Will be interesting to see CPU benchmarks.

And what about the RAM; are we coming to a point in time where 16GB is now the new sweet-spot? Maybe it is already and I just haven't been keeping up to date? :)

Yeah, the CPU requirements do seem a little wonky. I mean, the step up from a 3.3GHz i5 2500 to a 3.4GHz i7 3770 is minuscule if you remove HT from the equation. So do we infer that 8 threads will really bring a sizable improvement? That seems unlikely. As for the AMD recs -- it'll be a cold day in hell when a Phenom II 940 is the 'equal' of a 2500K. And then to put the 8350 in the recommended configuration, again 8 threads, but it could just as easily be taken to mean: the fastest AMD chip you can lay your hands on; say a 4GHz Piledriver, whatever.
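Quick back-of-the-envelope in Python, just to show how small the non-HT gap really is (the ~5% IPC figure for Ivy over Sandy is only an assumed ballpark, and turbo is ignored):

Code:
# Rough estimate: i5 2500 (3.3 GHz Sandy) vs i7 3770 (3.4 GHz Ivy), HT ignored.
# The ~5% IPC uplift is an assumed ballpark figure; turbo clocks are left out.
clock_2500 = 3.3
clock_3770 = 3.4
assumed_ipc_gain = 1.05
relative = (clock_3770 / clock_2500) * assumed_ipc_gain
print(f"3770 vs 2500 without HT: ~{(relative - 1) * 100:.0f}% faster")
# ~8%, so any meaningful gap in the requirements would have to come from HT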

However, if anybody's disposed to putting out a game/engine that makes use of that many cores/threads, CD Projekt has to be on that (short) list.

So I read those required/recommended CPU configurations as: 4 core sandy = good, 8 thread Ivy = better. 4 core Phenom II = ok-ish, 8 core Piledriver = Well, that's as fast a chip as those guys make, so whatever, that.

But oh how I would like to see their new engine scaling well above 4 threads/cores.

If for no better reason than to show up Linus.

Otherwise, if having more than four threads has the usual negligible impact, I think it's safe to assume that an overclocked 2500K will still pretty much hang in there with the latest and greatest (if the latter is left stock, of course).
 
Last edited:

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Nice. Another game drops 32-bit OSs entirely, a plus.

TW2 was the game that finally made me upgrade from the 4870, looks like TW3 will tempt me to upgrade to an R9 300 in the Spring.
 

Fire&Blood

Platinum Member
Jan 13, 2009
2,333
18
81
Anyone got 200 MHz I could borrow?
W2 runs fine maxed out @60+fps/1440p on the sig rig as long as I stay away from ubersampling; that cuts my frame rate in half, whereas on better PCs turning it on has less impact.
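For anyone wondering why ubersampling hurts that much: it's basically supersampling, i.e. rendering at a higher internal resolution and downscaling. A rough pixel-count sketch, assuming a 2x2 factor purely for illustration:

Code:
# Pixel counts at 1440p, native vs. 2x2 supersampled (illustrative only; I'm
# not claiming Witcher 2's ubersampling is exactly a 2x2 factor).
width, height = 2560, 1440
native = width * height
supersampled = (width * 2) * (height * 2)
print(f"native: {native:,} px, supersampled: {supersampled:,} px")
print(f"shading work scales by roughly {supersampled / native:.0f}x")
# 4x the pixels to shade easily explains a halved (or worse) frame rate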

If those gametech guys are correct, that's very good news. At 1440p I don't really need the most taxing AA options. Maybe I can squeeze by with this PC...
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Awfully tempted to lure myself into a 4790/K because of TW3. I mean, it's TW3; it's totally understandable to chase an ultimate system for it.

I'll have one of those CPUs in a cart by month's end, I bet.

You would be better off selling your existing CPU/GPU and going i5+970 or something similar.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Yeah, but tessmark on extreme is way over what any game would ever use, so it's not an accurate starting point to judge GCN's tessellation performance in actual games.

I get that tessmark uses tessellation to the extreme. The point is that if any modern game uses A LOT of tessellation, it's going to translate to a massive performance hit for current GCN cards. Their tessellation performance is nearly 3X lower than that of a 980. The reason we don't see a 290X being 3X slower than a 980 is because a game is not 100% tessellated like the NV tessellated City demo. As a result, games are limited by a combination of geometry, texture, shader, pixel/color fill-rate, and memory bandwidth. Recent games are not tessellation-heavy enough to expose the tessellation bottleneck in GCN cards above other bottlenecks such as textures, shaders, pixels, and memory bandwidth. However, with enough tessellation it's easily possible, which was my main point. You made a claim that GCN is not weak in tessellation, but it is. What you meant to say is that modern games aren't demanding enough in their tessellation use to show just how weak GCN is at tessellation vs. Kepler/Maxwell.
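To put some toy numbers on that: model the frame time as a sum of stage costs and only slow down the tessellation slice, and a 3X tessellation deficit barely moves the total until tessellation dominates the frame. The stage weights below are invented for illustration, not measured from any game:

Code:
# Toy frame-time model: only the tessellation slice of the frame gets slower.
# Stage weights (in ms) are invented for illustration, not profiled data.
frame_ms_980 = {"geometry/tess": 2.0, "shading": 8.0, "bandwidth/other": 4.0}
tess_penalty = 3.0  # assume ~3x lower tessellation throughput on the 290X
frame_ms_290x = dict(frame_ms_980)
frame_ms_290x["geometry/tess"] *= tess_penalty
t980 = sum(frame_ms_980.values())
t290x = sum(frame_ms_290x.values())
print(f"980: {t980:.1f} ms ({1000 / t980:.0f} fps), "
      f"290X: {t290x:.1f} ms ({1000 / t290x:.0f} fps)")
# With tessellation at ~14% of the frame, a 3x deficit only costs ~29% overall;
# raise the tessellation share and the gap blows out accordingly.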

Being a GW title has nothing to do with the performance hit of MSAA. MSAA has always had a large performance hit, and deferred rendering engines only exacerbated that.

Normally in deferred game engines the MSAA performance hit is around 25-40%, as seen in BF4. In GW titles, NV focuses on TXAA/FXAA while AMD has no access to the source code until after the game launches. As a result, the performance hit is 50-100% for both camps, as we've seen in recent titles, since MSAA optimization from NV/AMD is nearly non-existent. GW itself doesn't cause MSAA performance to fall through some trick; it's the indirect result of where the optimization focus ends up in GW titles, which generally results in massive MSAA drops on both AMD and NV. Instead of the already large 25-40% hit, since neither NV nor AMD optimize for MSAA as much as in the past, the performance hit with MSAA has continued to increase over the years. If you look at the MSAA performance hit in GW titles, it's more severe than in other games that use deferred engines. The MSAA performance hit in AC Unity and FC4, for example, is out of this world.

That was found out to be a hoax. But anyway, anyone that uses 8x MSAA is clueless in my opinion..

That's not the point. The point is that "maxing out" Witcher games implies turning on all possible settings in the game to the max. Since Witcher 2 featured Uber-sampling, which is itself more demanding than 8x MSAA, using the 780 Ti's 8x MSAA numbers as a reference point tells us that it can't possibly max out Witcher 3 (i.e., with Uber-sampling mode on).

That was a hoax as well. Witcher 3 will be running at mostly high settings on both the PS4 and Xbox One, with some of the more demanding settings on medium if I had to guess. Also, Xbox One will likely have 900p, and PS4 will get 1080p, both at 30 FPS.

Ok fair enough but if you have new info that contradicts older info, please post it as back-up. If Witcher 3 will run at 1080P @ 30 fps on High on a PS4 (7850 GPU), then it should fly on 970/290X/980.

The RAM requirements always allow for other applications or programs (plus the OS itself) to be running, which is why they are so high. Even these newer games don't use over 3GB of RAM by themselves, but if you factor in the OS plus a browser or whatever, then you can easily have well over 6GB of RAM that is being used..

1. Who buys high-end videocards and runs games with a bunch of other programs in the background? That's like buying a 980 and running your game at 290/970 level of performance because one is too lazy to close programs?

2. Even if you run tasks in the background that are RAM limited, which you never should when gaming, there is no reputable review/site that has ever shown more gaming performance/smoother gameplay with 16GB of RAM vs. 8GB of RAM in any PC game. If you have any info that contradicts this, please cite it. The point is the vast majority of PC gamers buy 16GB of RAM because they think it's needed/future-proof, but it has been a complete waste for PC games for the 5 years that people have been buying 16GB of RAM. Since RAM is so cheap and some people run the X79/X99 platform, it's logical that a lot of gamers would buy 4x4GB or 2x8GB kits for quad-channel memory bandwidth and future expansion, respectively. Yet none of this changes the fact that it does nothing tangible for gaming performance, unless you are talking about measuring level loading times with a stopwatch or running a RAM disk with your excess RAM.
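If anyone wants to measure this instead of arguing about it, here's a minimal sketch using the third-party psutil package (the process name is just a placeholder) that logs total system RAM use alongside the game's own working set while you play:

Code:
# Minimal sketch: log total system RAM use and one game's working set while
# playing. Requires the third-party psutil package; "witcher3.exe" is just a
# placeholder process name. Stop with Ctrl+C.
import time
import psutil

GAME_EXE = "witcher3.exe"  # placeholder; substitute the real process name
while True:
    vm = psutil.virtual_memory()
    game = [p for p in psutil.process_iter(["name", "memory_info"])
            if (p.info["name"] or "").lower() == GAME_EXE]
    game_mb = sum(p.info["memory_info"].rss for p in game) / 2**20
    print(f"system used: {vm.used / 2**30:.1f} GB ({vm.percent}%), "
          f"game: {game_mb:.0f} MB")
    time.sleep(5)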

Since Witcher 3 has now been delayed twice, with the new release date nearly a year past the original timeline, and since it's still 5 months away, anyone upgrading primarily for W3 should be waiting until May 19th and then possibly grabbing a much cheaper 970/980, a much faster GM200/R9 300 series card, and perhaps even a Broadwell i5/i7. For existing 280X/290/290X/780/780Ti/970/980 users, well, those gamers knew a long time ago that Witcher 3 wouldn't be out until Feb 2015, then May 2015, so any purchases of those GPUs were not made primarily for Witcher 3. It's always risky to buy a GPU for a game so far out considering how quickly things change in the GPU space in terms of price/performance and new SKUs. By May 19, 2015, the 970/980 will be 8 months old and the 290/290X/780Ti will be 18 months old, which is nearly 75% of a typical GPU generation's lifespan! We shouldn't really be expecting any of these cards to max out a next-gen game at 60 fps considering how old they will be when it launches, and given that the 970/980 are just mid-range next-gen cards, not high-end.

Having said that, hopefully Witcher 3's technical graphics actually merit the high GPU requirements, unlike FC4, Watch Dogs, Titanfall, AC Unity, and The Evil Within, which generally looked average-to-great but not amazing despite steep GPU/VRAM requirements. There have been some videos/screenshots that show downgraded Witcher 3 graphics. Fingers crossed it doesn't end up as yet another hyped-up "next gen" PC game that's really just a PS4/XB1 port with GW thrown in at the last minute (AC Unity tessellation patch MIA *cough*).
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I get that tessmark uses tessellation to the extreme. The point is that if any modern game uses A LOT of tessellation, it's going to translate to a massive performance hit for current GCN cards. Their tessellation performance is nearly 3X lower than that of a 980. The reason we don't see a 290X being 3X slower than a 980 is because a game is not 100% tessellated like the NV tessellated City demo. As a result, games are limited by a combination of geometry, texture, shader, pixel/color fill-rate, and memory bandwidth. Recent games are not tessellation-heavy enough to expose the tessellation bottleneck in GCN cards above other bottlenecks such as textures, shaders, pixels, and memory bandwidth. However, with enough tessellation it's easily possible, which was my main point. You made a claim that GCN is not weak in tessellation, but it is. What you meant to say is that modern games aren't demanding enough in their tessellation use to show just how weak GCN is at tessellation vs. Kepler/Maxwell.

Maybe you're right. It's too bad that the AC Unity tessellation patch isn't out yet. That would be a good indicator of the terrain tessellation they will use in the Witcher 3.

Normally in deferred game engines the MSAA performance hit is around 25-40%, as seen in BF4. In GW titles, NV focuses on TXAA/FXAA while AMD has no access to the source code until after the game launches. As a result, the performance hit is 50-100% for both camps, as we've seen in recent titles, since MSAA optimization from NV/AMD is nearly non-existent. GW itself doesn't cause MSAA performance to fall through some trick; it's the indirect result of where the optimization focus ends up in GW titles, which generally results in massive MSAA drops on both AMD and NV. Instead of the already large 25-40% hit, since neither NV nor AMD optimize for MSAA as much as in the past, the performance hit with MSAA has continued to increase over the years. If you look at the MSAA performance hit in GW titles, it's more severe than in other games that use deferred engines. The MSAA performance hit in AC Unity and FC4, for example, is out of this world.
MSAA is a hardware-based AA solution, so I don't think code optimization really factors heavily into its performance. Either the GPU has the resources to use it, or it doesn't. What affects MSAA performance (other than hardware specs) are things like whether an engine uses deferred rendering, and also what kind of game it is.

Games with lots of foliage for example will have a big performance hit when MSAA is turned on, because it's being applied to all the surfaces. Bigger and more complex games will also have a larger hit from turning MSAA on for the same reason.

Basically, the more stuff you have on screen, the greater the performance hit because more bandwidth is being used..
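The bandwidth angle is easy to put rough numbers on. Here's a quick sketch of raw color+depth render-target sizes at 1440p per MSAA level, assuming a simple 32-bit color + 32-bit depth layout and ignoring the compression real GPUs use, so treat it as an upper bound:

Code:
# Rough color+depth render-target sizes at 2560x1440 per MSAA level, assuming
# 4 bytes color + 4 bytes depth per sample and no compression (real GPUs do
# compress, so treat these as upper bounds).
width, height = 2560, 1440
bytes_per_sample = 4 + 4  # RGBA8 color + 32-bit depth
for samples in (1, 2, 4, 8):
    mb = width * height * samples * bytes_per_sample / 2**20
    print(f"{samples}x: ~{mb:.0f} MB of color+depth to read/write")
# ~28 MB at 1x, ~113 MB at 4x, ~225 MB at 8x; with a deferred G-buffer
# (several render targets) multiply that again, which is where the bandwidth
# hit comes from.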

That's not the point. The point is that "maxing out" Witcher games implies turning on all possible settings in the game to the max. Since Witcher 2 featured Uber-sampling, which is itself more demanding than 8x MSAA, using the 780 Ti's 8x MSAA numbers as a reference point tells us that it can't possibly max out Witcher 3 (i.e., with Uber-sampling mode on).
To me, maxing out a game implies that all graphical options are turned on to their highest setting. But AA, to me, is a bit of an exception. You always reach a point where diminishing returns make haphazardly increasing the level of MSAA foolish, not to mention the horrendous performance hit.

And shader-based AA techniques offer 95% of the anti-aliasing capability of MSAA, without the drawback.

Ok fair enough but if you have new info that contradicts older info, please post it as back-up. If Witcher 3 will run at 1080P @ 30 fps on High on a PS4 (7850 GPU), then it should fly on 970/290X/980.
Here is one article that mentions they are targeting 900p for the Xbox One and 1080p for the PS4.

But then a more recent article claims they were having issues hitting 1080p on the PS4.

So maybe it won't be 1080p after all. There are lots of tricks they can use to increase performance to hit 1080p, but at the expense of image quality. It may not be worth the trade-off.

It's better for them to run the game at a lower resolution so they don't have to sacrifice things like shadow detail or ambient occlusion quality.

But I never said they would use all high settings on the consoles. I said mostly high, but with some medium as well. And some stuff will be missing completely on the consoles, like the HairWorks simulation and advanced PhysX. Tessellation will be very limited as well.

1. Who buys high-end videocards and runs games with a bunch of other programs in the background? That's like buying a 980 and running your game at 290/970 level of performance because one is too lazy to close programs?
Tons of people do that. The most frequently used program on any PC is the web browser. A lot of people leave their browser and all the tabs open during gaming because they don't want to close them. And these 64 bit browsers can use a lot of RAM if you have a lot of tabs open..

Yet, none of this changes the fact that it does nothing tangible for gaming performance, unless you are talking about measuring level loading times with a stop-watch or running RAM Disk with your excess RAM.

It's not about performance. It's about making sure that some idiots won't blame the game for making their PC crash due to running low on RAM..
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
1. Who buys high-end videocards and runs games with a bunch of other programs in the background? That's like buying a 980 and running your game at 290/970 level of performance because one is too lazy to close programs?

BF3/BF4 pretty much need the browser open if you want things to be convenient. In BF3 you could close it if you're happy with the server, since Battlelog works without fail in that game, but I've personally kept it open in BF4 because of "Game disconnected: something went wrong" errors that pretty much make you launch the browser time and time again just so you can pick servers and play.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
I get that tessmark uses tessellation to the extreme. The point is that if any modern game uses A LOT of tessellation, it's going to translate to a massive performance hit for current GCN cards. Their tessellation performance is nearly 3X lower than that of a 980. The reason we don't see a 290X being 3X slower than a 980 is because a game is not 100% tessellated like the NV tessellated City demo. As a result, games are limited by a combination of geometry, texture, shader, pixel/color fill-rate, and memory bandwidth. Recent games are not tessellation-heavy enough to expose the tessellation bottleneck in GCN cards above other bottlenecks such as textures, shaders, pixels, and memory bandwidth. However, with enough tessellation it's easily possible, which was my main point. You made a claim that GCN is not weak in tessellation, but it is. What you meant to say is that modern games aren't demanding enough in their tessellation use to show just how weak GCN is at tessellation vs. Kepler/Maxwell.

Well, my original point was that GCN is not as handicapped in tessellation as AMD's older DX11 cards were. If you want to use Tessmark, just look at how AMD cards compare against each other:

http://www.anandtech.com/bench/GPU14/841

A 7770 comes out much better than a 6990 (but really, that's more of a reason to think that Tessmark is somehow bugged when it comes to running on AMD cards).

My ultimate point is that there's little precedent to think that TW3 will have so much tessellation that it will be the source of a significant performance difference between current AMD and Nvidia cards.
 

Omar F1

Senior member
Sep 29, 2009
491
8
76
:confused: I don't believe you. What resolution were you running at?

1440P / Max settings - GTX 770, but maybe the bunch of Internet Explorer tabs I had running did add to it.
The game even crashed a few times; I suspected it was running out of RAM.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
1440P / Max settings - GTX 770, but maybe the bunch of Internet Explorer tabs I had running did add to it.
The game even crashed a few times; I suspected it was running out of RAM.

Yeah, I'm highly skeptical a game from 2007 running on the same engine as the original Neverwinter Nights would use that much memory. If anything, the game probably isn't large address aware and ran out of memory supported by the engine, not the total memory on your graphics card/system.
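For the curious, large address awareness is literally one flag bit in the executable's PE header (IMAGE_FILE_LARGE_ADDRESS_AWARE, 0x0020), so it's easy to check yourself. A small sketch; the path is obviously a placeholder:

Code:
# Check the IMAGE_FILE_LARGE_ADDRESS_AWARE flag (0x0020) in a PE header.
# The path below is a placeholder; point it at the actual game executable.
import struct

def is_large_address_aware(path):
    with open(path, "rb") as f:
        data = f.read(4096)  # DOS stub + PE headers normally fit in here
    e_lfanew = struct.unpack_from("<I", data, 0x3C)[0]  # offset of "PE\0\0"
    assert data[e_lfanew:e_lfanew + 4] == b"PE\0\0", "not a PE executable"
    characteristics = struct.unpack_from("<H", data, e_lfanew + 22)[0]
    return bool(characteristics & 0x0020)

print(is_large_address_aware(r"C:\Games\The Witcher\witcher.exe"))  # placeholder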
 

D DoUrden

Member
Jul 19, 2012
47
0
0
Games are not using over 8 gigs of RAM yet; if your RAM monitor maxes out, or if you have 16 gigs and it shows more than 8 in use, it's most likely the pagefile pushing it past the 8 gigs.
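If you want to see which is which, a quick sketch with the third-party psutil package prints physical RAM in use next to pagefile/swap in use:

Code:
# Physical RAM in use vs. pagefile/swap in use (what a "RAM monitor" going
# past 8 gigs is often actually showing). Requires the psutil package.
import psutil

vm = psutil.virtual_memory()
sm = psutil.swap_memory()
print(f"physical RAM used: {vm.used / 2**30:.1f} / {vm.total / 2**30:.1f} GB")
print(f"pagefile/swap used: {sm.used / 2**30:.1f} / {sm.total / 2**30:.1f} GB")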

I built my new system with this game in mind, then they moved the release date!!

CPU: Intel Core i7-4790K 4.0GHz Quad-Core Processor
CPU Cooler: Swiftech H240-X 90.0 CFM Liquid CPU Cooler
Motherboard: Gigabyte GA-Z97X-UD3H-BK ATX LGA1150 Motherboard
Memory: G.Skill Ripjaws X Series 16GB (2 x 8GB) DDR3-2133 Memory
Video Card: Asus GeForce GTX 970 4GB STRIX Video Card (2-Way SLI)
Video Card: Asus GeForce GTX 970 4GB STRIX Video Card (2-Way SLI)
Case: Corsair 550D ATX Mid Tower Case
Monitor: Perfect Pixel QNIX QX2710 LED Evolution ll [Glossy] 27" 2560x1440 PLS PC Monitor
 
Last edited:

Xellos2099

Platinum Member
Mar 8, 2005
2,277
13
81
Those CPU specs got me a bit puzzled. I'm actually wondering if there will be any significant difference in performance at 1080p between the 3770 and 2500, especially with an overclocked 2500. Will be interesting to see CPU benchmarks.

And what about the RAM; are we coming to a point in time where 16GB is now the new sweet-spot? Maybe it is already and I just haven't been keeping up to date? :)

I was thinking the same thing. It would have made much more sense to make the Phenom II X4 940 the minimum requirement and the i5 2500K the recommended one.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Yeah, I'm highly skeptical a game from 2007 running on the same engine as the original Neverwinter Nights would use that much memory. If anything, the game probably isn't large address aware and ran out of memory supported by the engine, not the total memory on your graphics card/system.

Same. Witcher 1 was recently retested by GameGPU (Sept 2014 testing).

[GameGPU benchmark charts: GPU performance at 2560 and 3840 resolutions, VRAM usage, CPU scaling, and system RAM usage]

Source

There is no way TW1 uses 5GB of system memory by itself.

*Side-note* TW1 is a DX9 game, and contrary to the popular belief that GCN has issues in DX9 games, performance is good on both AMD and NV.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Well, it may have good performance on GCN, but that doesn't mean it can't have other issues, like texture flickering artifacts.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
I built my new system with this game in mind, then they moved the release date!!

CPU: Intel Core i7-4790K 4.0GHz Quad-Core Processor
CPU Cooler: Swiftech H240-X 90.0 CFM Liquid CPU Cooler
Motherboard: Gigabyte GA-Z97X-UD3H-BK ATX LGA1150 Motherboard
Memory: G.Skill Ripjaws X Series 16GB (2 x 8GB) DDR3-2133 Memory
Video Card: Asus GeForce GTX 970 4GB STRIX Video Card (2-Way SLI)
Video Card: Asus GeForce GTX 970 4GB STRIX Video Card (2-Way SLI)
Case: Corsair 550D ATX Mid Tower Case
Monitor: Perfect Pixel QNIX QX2710 LED Evolution ll [Glossy] 27" 2560x1440 PLS PC Monitor

You're going to be fine for TW3, even with a May release date. On the CPU front, I don't think the full Broadwell desktop parts are going to be widely available, and the only video card improvements would be OC'd GTX 9x0 parts and the R9 300 series. That top end tower today will still be a top end system this summer.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Games are not using over 8 gigs of RAM yet; if your RAM monitor maxes out, or if you have 16 gigs and it shows more than 8 in use, it's most likely the pagefile pushing it past the 8 gigs.

I built my new system with this game in mind, then they moved the release date!!

CPU: Intel Core i7-4790K 4.0GHz Quad-Core Processor
CPU Cooler: Swiftech H240-X 90.0 CFM Liquid CPU Cooler
Motherboard: Gigabyte GA-Z97X-UD3H-BK ATX LGA1150 Motherboard
Memory: G.Skill Ripjaws X Series 16GB (2 x 8GB) DDR3-2133 Memory
Video Card: Asus GeForce GTX 970 4GB STRIX Video Card (2-Way SLI)
Video Card: Asus GeForce GTX 970 4GB STRIX Video Card (2-Way SLI)
Case: Corsair 550D ATX Mid Tower Case
Monitor: Perfect Pixel QNIX QX2710 LED Evolution ll [Glossy] 27" 2560x1440 PLS PC Monitor

Is it so important to play a game on day 1 that you can't simply order the system after the game comes out?
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Is it so important to play a game on day 1 that you can't simply order the system after the game comes out?

Itches need to be scratched, and we all get the upgrade itch. For all we know, he had an extremely dated system before and wanted to upgrade regardless; TW3 would just have been a catalyst. I've been watching people in /r/buildapc and /r/witcher do this over the past month, but it happens before any big powerhouse eye-candy game gets ready to launch.